13830 1727204066.51910: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-G1p
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
13830 1727204066.52335: Added group all to inventory
13830 1727204066.52337: Added group ungrouped to inventory
13830 1727204066.52341: Group all now contains ungrouped
13830 1727204066.52344: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml
13830 1727204066.72483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
13830 1727204066.72553: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
13830 1727204066.72582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
13830 1727204066.72646: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
13830 1727204066.72840: Loaded config def from plugin (inventory/script)
13830 1727204066.72842: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
13830 1727204066.72896: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
13830 1727204066.73004: Loaded config def from plugin (inventory/yaml)
13830 1727204066.73006: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
13830 1727204066.73107: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
13830 1727204066.73622: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
13830 1727204066.73626: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
13830 1727204066.73632: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
13830 1727204066.73643: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
13830 1727204066.73653: Loading data from /tmp/network-M6W/inventory-5vW.yml
13830 1727204066.73726: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto
13830 1727204066.73810: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
13830 1727204066.73860: Loading data from /tmp/network-M6W/inventory-5vW.yml
13830 1727204066.73953: group all already in inventory
13830 1727204066.73960: set inventory_file for managed-node1
13830 1727204066.73970: set inventory_dir for managed-node1
13830 1727204066.73971: Added host managed-node1 to inventory
13830 1727204066.73978: Added host managed-node1 to group all
13830 1727204066.73979: set ansible_host for managed-node1
13830 1727204066.73980: set ansible_ssh_extra_args for managed-node1
13830 1727204066.73985: set inventory_file for managed-node2
13830 1727204066.73988: set inventory_dir for managed-node2
13830 1727204066.73989: Added host managed-node2 to inventory
13830 1727204066.73990: Added host managed-node2 to group
all 13830 1727204066.73991: set ansible_host for managed-node2 13830 1727204066.73992: set ansible_ssh_extra_args for managed-node2 13830 1727204066.73994: set inventory_file for managed-node3 13830 1727204066.73997: set inventory_dir for managed-node3 13830 1727204066.73997: Added host managed-node3 to inventory 13830 1727204066.73999: Added host managed-node3 to group all 13830 1727204066.73999: set ansible_host for managed-node3 13830 1727204066.74000: set ansible_ssh_extra_args for managed-node3 13830 1727204066.74003: Reconcile groups and hosts in inventory. 13830 1727204066.74008: Group ungrouped now contains managed-node1 13830 1727204066.74010: Group ungrouped now contains managed-node2 13830 1727204066.74012: Group ungrouped now contains managed-node3 13830 1727204066.74107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 13830 1727204066.74250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 13830 1727204066.74310: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 13830 1727204066.74345: Loaded config def from plugin (vars/host_group_vars) 13830 1727204066.74347: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 13830 1727204066.74355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 13830 1727204066.74366: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 13830 1727204066.74425: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 13830 1727204066.74814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204066.74926: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 13830 1727204066.74982: Loaded config def from plugin (connection/local) 13830 1727204066.74986: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 13830 1727204066.75698: Loaded config def from plugin (connection/paramiko_ssh) 13830 1727204066.75702: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 13830 1727204066.76862: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13830 1727204066.76911: Loaded config def from plugin (connection/psrp) 13830 1727204066.76919: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 13830 1727204066.77933: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13830 1727204066.77981: Loaded config def from plugin (connection/ssh) 13830 1727204066.77985: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 13830 1727204066.78373: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 13830 1727204066.78413: Loaded config def from plugin (connection/winrm) 13830 1727204066.78417: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 13830 1727204066.78456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 13830 1727204066.78527: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 13830 1727204066.78602: Loaded config def from plugin (shell/cmd) 13830 1727204066.78604: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 13830 1727204066.78630: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 13830 1727204066.78701: Loaded config def from plugin (shell/powershell) 13830 1727204066.78704: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 13830 1727204066.78771: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 13830 1727204066.78977: Loaded config def from plugin (shell/sh) 13830 1727204066.78979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 13830 1727204066.79020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 13830 1727204066.79150: Loaded config def from plugin (become/runas) 13830 1727204066.79153: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 13830 1727204066.79388: Loaded config def from plugin (become/su) 13830 1727204066.79391: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 13830 1727204066.79581: Loaded config def from plugin (become/sudo) 13830 1727204066.79583: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 13830 1727204066.79620: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml 13830 1727204066.79987: in VariableManager get_vars() 13830 1727204066.80010: done with get_vars() 13830 1727204066.80400: trying /usr/local/lib/python3.12/site-packages/ansible/modules 13830 1727204066.86867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 13830 1727204066.87316: in VariableManager get_vars() 13830 1727204066.87321: done with get_vars() 13830 1727204066.87324: variable 'playbook_dir' from source: magic vars 13830 1727204066.87325: variable 'ansible_playbook_python' from source: magic vars 13830 1727204066.87326: variable 'ansible_config_file' from 
source: magic vars 13830 1727204066.87327: variable 'groups' from source: magic vars 13830 1727204066.87328: variable 'omit' from source: magic vars 13830 1727204066.87329: variable 'ansible_version' from source: magic vars 13830 1727204066.87329: variable 'ansible_check_mode' from source: magic vars 13830 1727204066.87330: variable 'ansible_diff_mode' from source: magic vars 13830 1727204066.87331: variable 'ansible_forks' from source: magic vars 13830 1727204066.87331: variable 'ansible_inventory_sources' from source: magic vars 13830 1727204066.87332: variable 'ansible_skip_tags' from source: magic vars 13830 1727204066.87333: variable 'ansible_limit' from source: magic vars 13830 1727204066.87334: variable 'ansible_run_tags' from source: magic vars 13830 1727204066.87334: variable 'ansible_verbosity' from source: magic vars 13830 1727204066.87488: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml 13830 1727204066.88460: in VariableManager get_vars() 13830 1727204066.88481: done with get_vars() 13830 1727204066.88625: in VariableManager get_vars() 13830 1727204066.88640: done with get_vars() 13830 1727204066.88813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 13830 1727204066.88827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 13830 1727204066.89362: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 13830 1727204066.90109: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 13830 1727204066.90112: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 13830 1727204066.90144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 13830 1727204066.90172: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 13830 1727204066.90413: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 13830 1727204066.90474: Loaded config def from plugin (callback/default) 13830 1727204066.90477: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 13830 1727204066.91678: Loaded config def from plugin (callback/junit) 13830 1727204066.91681: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 13830 1727204066.91731: Loading 
ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
13830 1727204066.91814: Loaded config def from plugin (callback/minimal)
13830 1727204066.91817: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
13830 1727204066.91855: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
13830 1727204066.92141: Loaded config def from plugin (callback/tree)
13830 1727204066.92143: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
13830 1727204066.92270: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
13830 1727204066.92273: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
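For context: the inventory the run parsed at the start of this log (/tmp/network-M6W/inventory-5vW.yml) is not reproduced here. A minimal sketch of its likely shape, inferred only from the variables the YAML inventory plugin reports setting above (ansible_host and ansible_ssh_extra_args for managed-node1, managed-node2 and managed-node3); the concrete values below are placeholders, not taken from this run:

all:
  hosts:
    managed-node1:
      ansible_host: 10.31.x.x                    # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder options
    managed-node2:
      ansible_host: 10.31.x.x                    # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder options
    managed-node3:
      ansible_host: 10.31.15.87                  # the address the SSH debug output below connects to
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder options

Only the host names, the group membership (all/ungrouped), and the fact that ansible_host and ansible_ssh_extra_args are set per host are grounded in the log; everything else in the sketch is an assumption.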
PLAYBOOK: tests_bond_options_nm.yml ********************************************
2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml
13830 1727204066.92310: in VariableManager get_vars()
13830 1727204066.92330: done with get_vars()
13830 1727204066.92340: in VariableManager get_vars()
13830 1727204066.92349: done with get_vars()
13830 1727204066.92354: variable 'omit' from source: magic vars
13830 1727204066.92398: in VariableManager get_vars()
13830 1727204066.92414: done with get_vars()
13830 1727204066.92442: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_options.yml' with nm as provider] *****
13830 1727204066.93469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
13830 1727204066.93681: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
13830 1727204066.93976: getting the remaining hosts for this loop
13830 1727204066.93979: done getting the remaining hosts for this loop
13830 1727204066.93982: getting the next task for host managed-node3
13830 1727204066.93986: done getting next task for host managed-node3
13830 1727204066.93989: ^ task is: TASK: Gathering Facts
13830 1727204066.93991: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13830 1727204066.93994: getting variables
13830 1727204066.93995: in VariableManager get_vars()
13830 1727204066.94007: Calling all_inventory to load vars for managed-node3
13830 1727204066.94009: Calling groups_inventory to load vars for managed-node3
13830 1727204066.94012: Calling all_plugins_inventory to load vars for managed-node3
13830 1727204066.94025: Calling all_plugins_play to load vars for managed-node3
13830 1727204066.94036: Calling groups_plugins_inventory to load vars for managed-node3
13830 1727204066.94040: Calling groups_plugins_play to load vars for managed-node3
13830 1727204066.94085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13830 1727204066.94139: done with get_vars()
13830 1727204066.94147: done getting variables
13830 1727204066.94230: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.020) 0:00:00.020 *****
13830 1727204066.94254: entering _queue_task() for managed-node3/gather_facts
13830 1727204066.94256: Creating lock for gather_facts
13830 1727204066.94609: worker is 1 (out of 1 available)
13830 1727204066.94625: exiting _queue_task() for managed-node3/gather_facts
13830 1727204066.94639: done queuing things up, now waiting for results queue to drain
13830 1727204066.94640: waiting for pending results...
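The playbook banner above reports two plays in tests_bond_options_nm.yml: the wrapper play named "Run playbook 'playbooks/tests_bond_options.yml' with nm as provider" and the imported playbooks/tests_bond_options.yml. A minimal sketch of what such a wrapper typically looks like, assuming the provider is selected by setting a network_provider fact (the wrapper's actual tasks are not visible in this log):

- name: Run playbook 'playbooks/tests_bond_options.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set network provider to 'nm'   # assumed mechanism; not shown in the log
      set_fact:
        network_provider: nm

- import_playbook: playbooks/tests_bond_options.yml

Only the play name, the play count, and the reference to playbooks/tests_bond_options.yml come from the log; the task content is an assumption.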
13830 1727204066.94879: running TaskExecutor() for managed-node3/TASK: Gathering Facts 13830 1727204066.94982: in run() - task 0affcd87-79f5-1659-6b02-000000000015 13830 1727204066.95006: variable 'ansible_search_path' from source: unknown 13830 1727204066.95047: calling self._execute() 13830 1727204066.95117: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204066.95130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204066.95144: variable 'omit' from source: magic vars 13830 1727204066.95251: variable 'omit' from source: magic vars 13830 1727204066.95288: variable 'omit' from source: magic vars 13830 1727204066.95333: variable 'omit' from source: magic vars 13830 1727204066.95381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204066.95427: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204066.95454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204066.95479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204066.95499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204066.95533: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204066.95546: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204066.95554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204066.95662: Set connection var ansible_connection to ssh 13830 1727204066.95681: Set connection var ansible_timeout to 10 13830 1727204066.95691: Set connection var ansible_shell_executable to /bin/sh 13830 1727204066.95698: Set connection var ansible_shell_type to sh 13830 1727204066.95708: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204066.95726: Set connection var ansible_pipelining to False 13830 1727204066.95755: variable 'ansible_shell_executable' from source: unknown 13830 1727204066.95767: variable 'ansible_connection' from source: unknown 13830 1727204066.95775: variable 'ansible_module_compression' from source: unknown 13830 1727204066.95782: variable 'ansible_shell_type' from source: unknown 13830 1727204066.95789: variable 'ansible_shell_executable' from source: unknown 13830 1727204066.95797: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204066.95804: variable 'ansible_pipelining' from source: unknown 13830 1727204066.95810: variable 'ansible_timeout' from source: unknown 13830 1727204066.95827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204066.96021: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 13830 1727204066.96038: variable 'omit' from source: magic vars 13830 1727204066.96050: starting attempt loop 13830 1727204066.96057: running the handler 13830 1727204066.96078: variable 'ansible_facts' from source: unknown 13830 1727204066.96105: _low_level_execute_command(): starting 13830 1727204066.96116: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204066.96991: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 
1727204066.97008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204066.97029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204066.97050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204066.97101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204066.97114: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204066.97129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204066.97153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204066.97169: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204066.97182: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204066.97196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204066.97215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204066.97232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204066.97249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204066.97263: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204066.97280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204066.97366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204066.97385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204066.97400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204066.97555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204066.99151: stdout chunk (state=3): >>>/root <<< 13830 1727204066.99358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204066.99362: stdout chunk (state=3): >>><<< 13830 1727204066.99367: stderr chunk (state=3): >>><<< 13830 1727204066.99491: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 13830 1727204066.99494: _low_level_execute_command(): starting 13830 1727204066.99497: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930 `" && echo ansible-tmp-1727204066.9939063-13959-135049366041930="` echo /root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930 `" ) && sleep 0' 13830 1727204067.01173: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204067.01177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204067.01285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204067.01289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204067.01291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204067.01440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204067.01458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204067.01516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204067.03362: stdout chunk (state=3): >>>ansible-tmp-1727204066.9939063-13959-135049366041930=/root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930 <<< 13830 1727204067.03475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204067.03558: stderr chunk (state=3): >>><<< 13830 1727204067.03561: stdout chunk (state=3): >>><<< 13830 1727204067.03774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204066.9939063-13959-135049366041930=/root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204067.03777: variable 'ansible_module_compression' from source: unknown 13830 1727204067.03780: ANSIBALLZ: Using generic lock for ansible.legacy.setup 13830 1727204067.03782: ANSIBALLZ: Acquiring lock 13830 1727204067.03784: ANSIBALLZ: Lock acquired: 140043657885840 13830 1727204067.03786: ANSIBALLZ: Creating module 13830 1727204067.87329: ANSIBALLZ: Writing module into payload 13830 1727204067.87739: ANSIBALLZ: Writing module 13830 1727204067.87769: ANSIBALLZ: Renaming module 13830 1727204067.87837: ANSIBALLZ: Done creating module 13830 1727204067.87874: variable 'ansible_facts' from source: unknown 13830 1727204067.87877: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204067.87888: _low_level_execute_command(): starting 13830 1727204067.87894: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 13830 1727204067.89892: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204067.90019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204067.90029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204067.90046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204067.90136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204067.90143: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204067.90153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204067.90168: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204067.90176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204067.90183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204067.90191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204067.90200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204067.90211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204067.90229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204067.90279: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204067.90289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204067.90476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204067.90495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204067.90507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204067.90597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 
1727204067.92224: stdout chunk (state=3): >>>PLATFORM <<< 13830 1727204067.92307: stdout chunk (state=3): >>>Linux <<< 13830 1727204067.92319: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 13830 1727204067.92538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204067.92542: stdout chunk (state=3): >>><<< 13830 1727204067.92549: stderr chunk (state=3): >>><<< 13830 1727204067.92573: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204067.92584 [managed-node3]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 13830 1727204067.92625: _low_level_execute_command(): starting 13830 1727204067.92628: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 13830 1727204067.93319: Sending initial data 13830 1727204067.93321: Sent initial data (1181 bytes) 13830 1727204067.94312: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204067.94467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204067.94484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204067.94504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204067.94547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204067.94566: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204067.94582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204067.94599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204067.94610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204067.94621: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204067.94633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204067.94647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204067.94665: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204067.94684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204067.94696: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204067.94710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204067.94790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204067.94913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204067.94928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204067.95124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204067.98800: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 13830 1727204067.99186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204067.99475: stderr chunk (state=3): >>><<< 13830 1727204067.99479: stdout chunk (state=3): >>><<< 13830 1727204067.99482: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204067.99484: variable 'ansible_facts' from source: unknown 13830 1727204067.99486: variable 'ansible_facts' from source: unknown 13830 1727204067.99488: variable 'ansible_module_compression' from source: unknown 13830 1727204067.99490: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13830 1727204067.99492: variable 'ansible_facts' from source: unknown 13830 1727204067.99631: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930/AnsiballZ_setup.py 13830 1727204068.00749: Sending initial data 13830 1727204068.00753: Sent initial data (154 bytes) 13830 1727204068.03398: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204068.03417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.03433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.03453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.03504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204068.03517: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204068.03535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.03554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204068.03580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204068.03712: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204068.03834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.03849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.03945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.03960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204068.03977: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204068.03993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.04077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204068.04162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204068.04182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204068.04379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204068.06085: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204068.06135: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204068.06176: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpq4i6f5cm 
/root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930/AnsiballZ_setup.py <<< 13830 1727204068.06215: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204068.09282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204068.09286: stderr chunk (state=3): >>><<< 13830 1727204068.09289: stdout chunk (state=3): >>><<< 13830 1727204068.09291: done transferring module to remote 13830 1727204068.09293: _low_level_execute_command(): starting 13830 1727204068.09295: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930/ /root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930/AnsiballZ_setup.py && sleep 0' 13830 1727204068.09968: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204068.09994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.10011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.10031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.10075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204068.10094: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204068.10109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.10127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204068.10140: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204068.10152: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204068.10167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.10182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.10204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.10218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204068.10230: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204068.10244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.10328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204068.10346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204068.10360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204068.10438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204068.12300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204068.12304: stdout chunk (state=3): >>><<< 13830 1727204068.12306: stderr chunk (state=3): >>><<< 13830 1727204068.12312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204068.12314: _low_level_execute_command(): starting 13830 1727204068.12316: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930/AnsiballZ_setup.py && sleep 0' 13830 1727204068.12953: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204068.12957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.12959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.13104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.13107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204068.13110: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204068.13112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.13114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204068.13116: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204068.13118: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204068.13120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.13121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.13123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.13125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204068.13127: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204068.13128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.13190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204068.13207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204068.13219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204068.13301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204068.15184: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 13830 1727204068.15190: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' 
# <<< 13830 1727204068.15256: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 13830 1727204068.15304: stdout chunk (state=3): >>>import 'posix' # <<< 13830 1727204068.15338: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 13830 1727204068.15383: stdout chunk (state=3): >>>import 'time' # <<< 13830 1727204068.15392: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 13830 1727204068.15446: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.15477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 13830 1727204068.15508: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90fefdc0> <<< 13830 1727204068.15587: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 13830 1727204068.15591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90fefb20> <<< 13830 1727204068.15615: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90fefac0> <<< 13830 1727204068.15644: stdout chunk (state=3): >>>import '_signal' # <<< 13830 1727204068.15681: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 13830 1727204068.15684: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd8490> <<< 13830 1727204068.15729: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 13830 1727204068.15753: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd8940> <<< 13830 1727204068.15768: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd8670> <<< 13830 1727204068.15840: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 13830 1727204068.15870: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 13830 1727204068.15885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 13830 1727204068.15913: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b8f190> <<< 13830 1727204068.15945: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 13830 1727204068.15958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 13830 1727204068.16023: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b8f220> <<< 13830 1727204068.16086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 13830 1727204068.16090: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b8f940> <<< 13830 1727204068.16141: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bf0880> <<< 13830 1727204068.16168: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b88d90> <<< 13830 1727204068.16219: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 13830 1727204068.16222: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bb2d90> <<< 13830 1727204068.16292: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd8970> <<< 13830 1727204068.16305: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13830 1727204068.16632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 13830 1727204068.16683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 13830 1727204068.16708: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 13830 1727204068.16736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 13830 1727204068.16756: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b2ef10> <<< 13830 1727204068.16813: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b340a0> <<< 13830 1727204068.16842: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 13830 1727204068.16875: stdout chunk (state=3): >>>import '_sre' # <<< 13830 1727204068.16911: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 13830 1727204068.16940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 13830 1727204068.16975: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b275b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b2f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b2e3d0> <<< 13830 1727204068.16988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 13830 1727204068.17055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 13830 1727204068.17079: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 13830 1727204068.17131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.17182: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 13830 1727204068.17186: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f907d6e50> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8f907d6940> <<< 13830 1727204068.17249: stdout chunk (state=3): >>>import 'itertools' # <<< 13830 1727204068.17253: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907d6f40> <<< 13830 1727204068.17294: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 13830 1727204068.17309: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907d6d90> <<< 13830 1727204068.17331: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7100> import '_collections' # <<< 13830 1727204068.17387: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b09dc0> import '_functools' # <<< 13830 1727204068.17414: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b026a0> <<< 13830 1727204068.17503: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b15700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b35eb0> <<< 13830 1727204068.17506: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 13830 1727204068.17536: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f907e7d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b092e0> <<< 13830 1727204068.17587: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f90b15310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b3ba60> <<< 13830 1727204068.17619: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 13830 1727204068.17654: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.17690: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 13830 1727204068.17724: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7e20> <<< 13830 1727204068.17758: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7d90> <<< 13830 1727204068.17778: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 13830 1727204068.17791: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 13830 1727204068.17820: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 13830 1727204068.17863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13830 1727204068.17901: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907ba400> <<< 13830 1727204068.17928: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13830 1727204068.17959: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907ba4f0> <<< 13830 1727204068.18081: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907eff70> <<< 13830 1727204068.18152: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e9ac0> <<< 13830 1727204068.18156: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e9490> <<< 13830 1727204068.18188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 13830 1727204068.18208: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 13830 1727204068.18256: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from 
'/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 13830 1727204068.18273: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90708250> <<< 13830 1727204068.18291: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907a5550> <<< 13830 1727204068.18335: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e9f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b3b0d0> <<< 13830 1727204068.18374: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 13830 1727204068.18419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 13830 1727204068.18434: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9071ab80> import 'errno' # <<< 13830 1727204068.18478: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f9071aeb0> <<< 13830 1727204068.18507: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13830 1727204068.18543: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9072b7c0> <<< 13830 1727204068.18555: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13830 1727204068.18584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 13830 1727204068.18610: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9072bd00> <<< 13830 1727204068.18654: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906c5430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9071afa0> <<< 13830 1727204068.18681: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 13830 1727204068.18740: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906d5310> <<< 13830 1727204068.18776: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9072b640> import 'pwd' # <<< 13830 1727204068.18788: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906d53d0> <<< 13830 1727204068.18833: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7a60> <<< 13830 1727204068.18870: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13830 1727204068.18892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 13830 1727204068.18930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13830 1727204068.18954: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906f1730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13830 1727204068.18988: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906f1a00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f906f17f0> <<< 13830 1727204068.19020: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906f18e0> <<< 13830 1727204068.19049: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13830 1727204068.19247: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906f1d30> <<< 13830 1727204068.19282: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906fb280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f906f1970> <<< 13830 1727204068.19308: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f906e4ac0> <<< 13830 1727204068.19332: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7640> <<< 13830 1727204068.19353: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13830 1727204068.19413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 13830 1727204068.19442: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f906f1b20> <<< 13830 1727204068.19607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8f9060b700> <<< 13830 1727204068.19866: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 13830 1727204068.19948: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.20019: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 13830 1727204068.20034: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.21233: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.22188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054a850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.22220: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 13830 1727204068.22260: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 13830 1727204068.22290: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f9054a160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054a280> <<< 13830 1727204068.22333: stdout chunk (state=3): >>>import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054afa0> <<< 13830 1727204068.22350: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 13830 1727204068.22400: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054a4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054adc0> import 'atexit' # <<< 13830 1727204068.22441: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f9054a580> <<< 13830 1727204068.22454: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 13830 1727204068.22490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13830 1727204068.22520: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054a100> <<< 13830 1727204068.22546: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13830 1727204068.22591: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 13830 1727204068.22623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 13830 1727204068.22714: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffedf70> <<< 13830 1727204068.22744: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff08370> <<< 13830 1727204068.22779: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff08070> <<< 13830 1727204068.22803: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 13830 1727204068.22846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13830 1727204068.22861: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ff08cd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90532dc0> <<< 13830 
1727204068.23039: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f905323a0> <<< 13830 1727204068.23078: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90532f40> <<< 13830 1727204068.23097: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 13830 1727204068.23131: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 13830 1727204068.23186: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 13830 1727204068.23198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9057ff40> <<< 13830 1727204068.23278: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90551d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90551430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffe1af0> <<< 13830 1727204068.23310: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f90551550> <<< 13830 1727204068.23339: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90551580> <<< 13830 1727204068.23368: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 13830 1727204068.23396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 13830 1727204068.23428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 13830 1727204068.23511: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff76fa0> import 'datetime' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8f90591280> <<< 13830 1727204068.23533: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 13830 1727204068.23584: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff74820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90591400> <<< 13830 1727204068.23620: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 13830 1727204068.23648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.23681: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 13830 1727204068.23692: stdout chunk (state=3): >>>import '_string' # <<< 13830 1727204068.23748: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90591c40> <<< 13830 1727204068.23872: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ff747c0> <<< 13830 1727204068.23965: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f9052a1c0> <<< 13830 1727204068.23995: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f905919d0> <<< 13830 1727204068.24051: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f90591550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9058a940> <<< 13830 1727204068.24086: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 13830 1727204068.24108: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 13830 1727204068.24119: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 13830 1727204068.24167: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff68910> <<< 13830 1727204068.24339: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff85dc0> <<< 13830 1727204068.24351: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ff72550> <<< 13830 1727204068.24420: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff68eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ff72970> # zipimport: zlib available # zipimport: zlib available <<< 13830 1727204068.24436: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 13830 1727204068.24512: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.24584: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13830 1727204068.24641: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 13830 1727204068.24657: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.24747: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.24845: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.25293: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.25755: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 13830 1727204068.25790: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 13830 1727204068.25804: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.25852: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ffae7f0> <<< 13830 1727204068.25938: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffb38b0> <<< 13830 1727204068.25949: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8faf0940> <<< 13830 1727204068.25987: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 13830 1727204068.26006: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.26045: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 13830 1727204068.26159: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.26290: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13830 1727204068.26319: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffec730> <<< 13830 1727204068.26332: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.26716: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27079: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27139: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27202: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 13830 1727204068.27215: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27246: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27280: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 13830 1727204068.27292: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27348: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27410: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 13830 1727204068.27454: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available 
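(At this point the log shows the ansible.module_utils packages, six, the text converters, common.collections, common.warnings, errors and, just below, parsing.convert_bool and basic, being imported out of the payload zip on the target. These are the helpers a module author consumes through AnsibleModule. A minimal sketch of a custom module that depends on exactly this import chain is shown below; the module name and its single option are illustrative and are not part of the setup run being logged, and the file is meant to be executed by Ansible on the target rather than run by hand.)

```python
#!/usr/bin/python
# Minimal sketch of a custom module built on ansible.module_utils.basic,
# i.e. on the packages being imported from the payload zip above.
from __future__ import absolute_import, division, print_function
__metaclass__ = type

from ansible.module_utils.basic import AnsibleModule


def main():
    module = AnsibleModule(
        argument_spec=dict(
            # Validated by the arg_spec/parameters helpers seen in the log.
            data=dict(type='str', default='pong'),
        ),
        supports_check_mode=True,
    )
    # Return a JSON result to the controller, as every module does.
    module.exit_json(changed=False, ping=module.params['data'])


if __name__ == '__main__':
    main()
```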
import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 13830 1727204068.27479: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27491: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27530: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 13830 1727204068.27719: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.27912: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13830 1727204068.27943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 13830 1727204068.28027: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054d2e0> # zipimport: zlib available <<< 13830 1727204068.28087: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28162: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 13830 1727204068.28183: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 13830 1727204068.28194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28233: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28267: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 13830 1727204068.28279: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28313: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28354: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28442: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28502: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13830 1727204068.28534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.28600: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ffa5880> <<< 13830 1727204068.28697: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f96c550> <<< 13830 1727204068.28738: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 13830 1727204068.28753: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28800: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28847: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28877: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.28923: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 13830 1727204068.28945: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 13830 1727204068.28956: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 13830 1727204068.28983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 13830 1727204068.29014: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 13830 1727204068.29036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13830 1727204068.29119: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffb6910> <<< 13830 1727204068.29152: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9051c970> <<< 13830 1727204068.29216: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffea850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 13830 1727204068.29242: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29270: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29281: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 13830 1727204068.29370: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 13830 1727204068.29390: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 13830 1727204068.29450: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29505: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29526: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29541: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29570: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29602: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29633: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29671: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 13830 1727204068.29683: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29737: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29804: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29819: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.29851: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 13830 1727204068.30000: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.30135: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.30169: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.30221: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.30262: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 13830 1727204068.30282: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 13830 1727204068.30312: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f871c70> <<< 13830 1727204068.30341: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 13830 1727204068.30359: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 13830 1727204068.30378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 13830 1727204068.30417: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 13830 1727204068.30436: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8facfa30> <<< 13830 1727204068.30467: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8facf9a0> <<< 13830 1727204068.30534: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb1db20> <<< 13830 1727204068.30576: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb1d550> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb052e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb05970> <<< 13830 1727204068.30601: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 13830 1727204068.30634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 13830 1727204068.30670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 13830 1727204068.30697: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8fab62b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fab6a00> <<< 13830 1727204068.30709: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 13830 1727204068.30742: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fab6940> <<< 13830 1727204068.30761: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 13830 1727204068.30788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 13830 1727204068.30800: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8f8d20d0> <<< 13830 1727204068.30844: stdout chunk (state=3): >>>import 'multiprocessing.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffa13a0> <<< 13830 1727204068.30891: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb05670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 13830 1727204068.30915: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 13830 1727204068.30952: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31007: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 13830 1727204068.31049: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31098: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 13830 1727204068.31134: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 13830 1727204068.31156: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31187: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 13830 1727204068.31239: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31274: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 13830 1727204068.31313: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31358: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 13830 1727204068.31409: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31467: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31504: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31560: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 13830 1727204068.31581: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.31950: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32317: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 13830 1727204068.32320: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32358: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32409: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32434: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32477: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 13830 1727204068.32506: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32532: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 13830 1727204068.32592: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32647: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 13830 1727204068.32650: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32680: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32700: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 13830 1727204068.32739: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32754: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 13830 1727204068.32815: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.32885: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 13830 1727204068.32941: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7c1eb0> <<< 13830 
1727204068.32945: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 13830 1727204068.32959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 13830 1727204068.33113: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7c19d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 13830 1727204068.33117: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.33163: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.33221: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 13830 1727204068.33301: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.33381: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 13830 1727204068.33436: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.33506: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 13830 1727204068.33542: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.33598: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 13830 1727204068.33616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 13830 1727204068.33745: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8f82dbb0> <<< 13830 1727204068.33987: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7d0a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 13830 1727204068.33991: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34035: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34085: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 13830 1727204068.34088: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34154: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34227: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13830 1727204068.34313: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34453: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 13830 1727204068.34487: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34529: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 13830 1727204068.34532: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34568: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34611: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 13830 1727204068.34679: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8f834040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f8346d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 13830 1727204068.34705: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 13830 1727204068.34735: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34771: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 13830 1727204068.34790: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.34908: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35035: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 13830 1727204068.35130: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35226: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35274: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 13830 1727204068.35363: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35378: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35482: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35607: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 13830 1727204068.35716: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35827: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 13830 1727204068.35835: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35850: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.35884: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.36309: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.37259: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available <<< 13830 1727204068.37369: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 13830 1727204068.37373: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.37563: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.37761: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 13830 1727204068.37780: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.37783: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 13830 1727204068.37806: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.37847: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.38602: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13830 1727204068.38749: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py<<< 13830 1727204068.38768: stdout chunk (state=3): >>> <<< 13830 1727204068.38772: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py<<< 13830 1727204068.38774: stdout chunk (state=3): >>> <<< 13830 1727204068.38809: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204068.38815: stdout chunk (state=3): >>> <<< 13830 1727204068.38866: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.38931: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 13830 1727204068.38939: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.38985: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39000: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 13830 1727204068.39022: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39081: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39145: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 13830 1727204068.39179: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39212: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 13830 1727204068.39216: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39250: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39313: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 13830 1727204068.39360: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 
1727204068.39419: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 13830 1727204068.39427: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39636: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39852: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 13830 1727204068.39856: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.39926: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.40321: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available <<< 13830 1727204068.40394: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 13830 1727204068.40422: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.40426: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.40438: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 13830 1727204068.40440: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.40495: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204068.41004: stdout chunk (state=3): >>> import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available <<< 13830 1727204068.41007: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 13830 1727204068.41035: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.41301: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.41568: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 13830 1727204068.41592: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.41656: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.41724: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 13830 1727204068.41748: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.41820: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.41883: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 13830 1727204068.41904: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.42013: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.42123: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 13830 1727204068.42146: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 13830 1727204068.42171: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204068.42283: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204068.42288: stdout chunk (state=3): >>> <<< 13830 1727204068.42411: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py<<< 13830 1727204068.42427: stdout chunk (state=3): >>> import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py<<< 13830 1727204068.42432: stdout chunk (state=3): >>> import ansible.module_utils.facts # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py<<< 13830 1727204068.42435: stdout chunk (state=3): >>> <<< 13830 1727204068.42543: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204068.42551: stdout chunk (state=3): >>> <<< 13830 1727204068.43128: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 13830 1727204068.43149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 13830 1727204068.43157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 13830 1727204068.43173: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8f7b6eb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7b6b80> <<< 13830 1727204068.43217: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f763d30> <<< 13830 1727204068.43609: stdout chunk (state=3): >>>import 'gc' # <<< 13830 1727204068.45859: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 13830 1727204068.45901: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7b6220> <<< 13830 1727204068.45923: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7781f0> <<< 13830 1727204068.46007: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204068.46025: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f5caa00> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f5ca0d0> <<< 13830 1727204068.46248: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread 
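The "# loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/...zip/..." lines above show the target-side setup module importing its module_utils out of a single zipped payload, which Python's zipimport machinery serves once the archive sits on sys.path. A minimal sketch of that mechanism, using made-up names (demo_payload.zip, demo_pkg) rather than Ansible's real payload layout:

import importlib
import os
import sys
import tempfile
import zipfile

# Build a tiny zip archive containing one package (hypothetical names, illustration only).
pkg_source = "GREETING = 'hello from inside the zip'\n"
tmpdir = tempfile.mkdtemp()
zip_path = os.path.join(tmpdir, "demo_payload.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("demo_pkg/__init__.py", pkg_source)

# A zip archive on sys.path is handled by zipimport, so the import below
# resolves to .../demo_payload.zip/demo_pkg/__init__.py.
sys.path.insert(0, zip_path)
demo_pkg = importlib.import_module("demo_pkg")
print(demo_pkg.__file__)
print(demo_pkg.GREETING)

The repeated "# zipimport: zlib available" markers in the chunks above are CPython's verbose importer confirming it can decompress members of such archives.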
still has a frame <<< 13830 1727204068.72079: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.47, "5m": 0.3, "15m": 0.13}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4<<< 13830 1727204068.72101: stdout chunk (state=3): >>>Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2813, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 719, "free": 2813}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 414, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264283635712, "block_size": 4096, "block_total": 65519355, "block_available": 64522372, "block_used": 996983, "inode_total": 131071472, "inode_available": 130998315, "inode_used": 73157, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_date_time": {"<<< 13830 1727204068.72135: stdout chunk (state=3): >>>year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "28", "epoch": "1727204068", "epoch_int": "1727204068", "date": "2024-09-24", "time": "14:54:28", "iso8601_micro": "2024-09-24T18:54:28.662232Z", "iso8601": "2024-09-24T18:54:28Z", "iso8601_basic": "20240924T145428662232", "iso8601_basic_short": "20240924T145428", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", 
"network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive<<< 13830 1727204068.72163: stdout chunk (state=3): >>>_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on 
[fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": 
"UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-<<< 13830 1727204068.72510: stdout chunk (state=3): >>>64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13830 1727204068.72887: stdout chunk (state=3): >>># clear builtins._ <<< 13830 1727204068.72894: stdout chunk (state=3): >>># clear sys.path<<< 13830 1727204068.72958: stdout chunk (state=3): >>> <<< 13830 1727204068.72976: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1<<< 13830 1727204068.72992: stdout chunk (state=3): >>> # clear sys.ps2<<< 13830 1727204068.73004: stdout chunk (state=3): >>> # clear sys.last_type <<< 13830 1727204068.73009: stdout chunk (state=3): >>># clear sys.last_value<<< 13830 1727204068.73011: stdout chunk (state=3): >>> <<< 13830 1727204068.73013: stdout chunk (state=3): >>># clear sys.last_traceback <<< 13830 1727204068.73015: stdout chunk (state=3): >>># clear sys.path_hooks <<< 13830 1727204068.73018: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 13830 1727204068.73049: stdout chunk (state=3): >>># clear sys.meta_path <<< 13830 1727204068.73052: stdout chunk (state=3): >>># clear sys.__interactivehook__<<< 13830 1727204068.73083: stdout chunk (state=3): >>> <<< 13830 1727204068.73089: stdout chunk (state=3): >>># restore sys.stdin <<< 13830 1727204068.73100: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr <<< 13830 1727204068.73128: stdout chunk (state=3): >>># cleanup[2] removing sys <<< 13830 1727204068.73151: stdout chunk (state=3): >>># cleanup[2] removing builtins <<< 13830 1727204068.73176: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib<<< 13830 1727204068.73202: stdout chunk (state=3): >>> # cleanup[2] removing _imp <<< 13830 1727204068.73222: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings <<< 13830 1727204068.73239: stdout chunk (state=3): >>># cleanup[2] removing _weakref <<< 13830 1727204068.73265: stdout chunk (state=3): >>># cleanup[2] removing _io <<< 13830 1727204068.73270: stdout chunk (state=3): >>># cleanup[2] removing marshal <<< 13830 1727204068.73272: stdout chunk (state=3): >>># cleanup[2] removing posix<<< 13830 1727204068.73277: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib_external<<< 13830 1727204068.73307: stdout chunk (state=3): >>> # cleanup[2] removing time # cleanup[2] removing zipimport<<< 13830 1727204068.73320: stdout chunk (state=3): >>> # cleanup[2] removing _codecs <<< 13830 1727204068.73323: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases<<< 13830 1727204068.73333: stdout chunk (state=3): >>> # cleanup[2] removing encodings <<< 13830 1727204068.73366: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 <<< 13830 1727204068.73375: stdout chunk (state=3): >>># cleanup[2] removing _signal <<< 13830 1727204068.73381: stdout chunk (state=3): >>># cleanup[2] removing encodings.latin_1 <<< 13830 
1727204068.73431: stdout chunk (state=3): >>># cleanup[2] removing _abc # cleanup[2] removing abc<<< 13830 1727204068.73438: stdout chunk (state=3): >>> <<< 13830 1727204068.73443: stdout chunk (state=3): >>># cleanup[2] removing io<<< 13830 1727204068.73447: stdout chunk (state=3): >>> # cleanup[2] removing __main__ <<< 13830 1727204068.73457: stdout chunk (state=3): >>># cleanup[2] removing _stat <<< 13830 1727204068.73476: stdout chunk (state=3): >>># cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 13830 1727204068.73488: stdout chunk (state=3): >>># cleanup[2] removing genericpath <<< 13830 1727204068.73509: stdout chunk (state=3): >>># cleanup[2] removing posixpath <<< 13830 1727204068.73524: stdout chunk (state=3): >>># cleanup[2] removing os.path <<< 13830 1727204068.73567: stdout chunk (state=3): >>># cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 13830 1727204068.73582: stdout chunk (state=3): >>># cleanup[2] removing _locale <<< 13830 1727204068.73614: stdout chunk (state=3): >>># cleanup[2] removing _bootlocale<<< 13830 1727204068.73618: stdout chunk (state=3): >>> <<< 13830 1727204068.73638: stdout chunk (state=3): >>># destroy _bootlocale <<< 13830 1727204068.73642: stdout chunk (state=3): >>># cleanup[2] removing site<<< 13830 1727204068.73645: stdout chunk (state=3): >>> <<< 13830 1727204068.73647: stdout chunk (state=3): >>># destroy site<<< 13830 1727204068.73652: stdout chunk (state=3): >>> <<< 13830 1727204068.73669: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre <<< 13830 1727204068.73700: stdout chunk (state=3): >>># cleanup[2] removing sre_constants <<< 13830 1727204068.73704: stdout chunk (state=3): >>># destroy sre_constants <<< 13830 1727204068.73720: stdout chunk (state=3): >>># cleanup[2] removing sre_parse # cleanup[2] removing sre_compile<<< 13830 1727204068.73778: stdout chunk (state=3): >>> # cleanup[2] removing _heapq <<< 13830 1727204068.73806: stdout chunk (state=3): >>># cleanup[2] removing heapq <<< 13830 1727204068.73835: stdout chunk (state=3): >>># cleanup[2] removing itertools <<< 13830 1727204068.73859: stdout chunk (state=3): >>># cleanup[2] removing keyword<<< 13830 1727204068.73872: stdout chunk (state=3): >>> # destroy keyword<<< 13830 1727204068.73879: stdout chunk (state=3): >>> <<< 13830 1727204068.73881: stdout chunk (state=3): >>># cleanup[2] removing _operator<<< 13830 1727204068.73882: stdout chunk (state=3): >>> # cleanup[2] removing operator<<< 13830 1727204068.73883: stdout chunk (state=3): >>> # cleanup[2] removing reprlib<<< 13830 1727204068.73885: stdout chunk (state=3): >>> <<< 13830 1727204068.73886: stdout chunk (state=3): >>># destroy reprlib <<< 13830 1727204068.73887: stdout chunk (state=3): >>># cleanup[2] removing _collections <<< 13830 1727204068.73888: stdout chunk (state=3): >>># cleanup[2] removing collections<<< 13830 1727204068.73889: stdout chunk (state=3): >>> <<< 13830 1727204068.73895: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg<<< 13830 1727204068.73910: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing 
collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing<<< 13830 1727204068.73930: stdout chunk (state=3): >>> # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression<<< 13830 1727204068.73954: stdout chunk (state=3): >>> # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2<<< 13830 1727204068.73973: stdout chunk (state=3): >>> # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils<<< 13830 1727204068.74006: stdout chunk (state=3): >>> # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal<<< 13830 1727204068.74009: stdout chunk (state=3): >>> # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token<<< 13830 1727204068.74036: stdout chunk (state=3): >>> # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid<<< 13830 1727204068.74046: stdout chunk (state=3): >>> # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal<<< 13830 1727204068.74076: stdout chunk (state=3): >>> # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text<<< 13830 1727204068.74083: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters<<< 13830 1727204068.74111: stdout chunk (state=3): >>> # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings<<< 13830 1727204068.74136: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation<<< 13830 1727204068.74167: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro <<< 13830 1727204068.74216: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing<<< 13830 1727204068.74270: stdout chunk (state=3): >>> # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass<<< 13830 1727204068.74300: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin<<< 13830 1727204068.74316: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd<<< 13830 1727204068.74346: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat<<< 13830 1727204068.74353: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor<<< 13830 1727204068.74383: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr<<< 13830 1727204068.74408: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl<<< 13830 1727204068.74444: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux<<< 13830 1727204068.74599: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 13830 1727204068.74857: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 13830 1727204068.74863: stdout chunk (state=3): >>> <<< 13830 1727204068.74901: stdout chunk (state=3): >>># destroy importlib.util<<< 13830 1727204068.74904: stdout chunk (state=3): >>> # destroy importlib.abc<<< 13830 1727204068.74933: stdout chunk (state=3): >>> <<< 13830 1727204068.74936: stdout chunk (state=3): >>># destroy importlib.machinery<<< 13830 1727204068.74939: stdout chunk (state=3): >>> <<< 13830 1727204068.74990: stdout chunk (state=3): >>># destroy zipimport<<< 13830 1727204068.74993: stdout chunk (state=3): >>> <<< 13830 1727204068.75015: stdout chunk (state=3): >>># destroy _compression<<< 13830 1727204068.75027: stdout chunk (state=3): >>> <<< 13830 1727204068.75070: stdout chunk (state=3): >>># destroy binascii <<< 13830 1727204068.75074: stdout chunk (state=3): >>># destroy importlib<<< 13830 1727204068.75077: stdout chunk (state=3): >>> # destroy bz2 <<< 13830 1727204068.75082: stdout chunk (state=3): >>># destroy lzma <<< 13830 1727204068.75131: stdout chunk (state=3): >>># destroy __main__<<< 13830 1727204068.75153: stdout chunk (state=3): >>> # destroy locale<<< 13830 1727204068.75172: stdout chunk (state=3): >>> # destroy systemd.journal<<< 13830 1727204068.75191: stdout chunk (state=3): >>> # destroy systemd.daemon <<< 13830 1727204068.75222: stdout chunk (state=3): >>># destroy hashlib<<< 13830 1727204068.75230: stdout chunk (state=3): >>> # destroy json.decoder<<< 13830 1727204068.75249: stdout chunk (state=3): >>> # destroy json.encoder<<< 13830 1727204068.75252: stdout chunk (state=3): >>> <<< 13830 1727204068.75255: stdout chunk (state=3): >>># destroy json.scanner <<< 13830 1727204068.75257: stdout chunk (state=3): >>># destroy _json <<< 13830 1727204068.75258: stdout chunk (state=3): >>># destroy encodings<<< 13830 1727204068.75259: stdout chunk (state=3): >>> <<< 13830 1727204068.75306: stdout chunk (state=3): >>># destroy syslog<<< 13830 1727204068.75309: stdout chunk (state=3): >>> # destroy uuid<<< 13830 
1727204068.75315: stdout chunk (state=3): >>> <<< 13830 1727204068.75372: stdout chunk (state=3): >>># destroy selinux<<< 13830 1727204068.75397: stdout chunk (state=3): >>> # destroy distro<<< 13830 1727204068.75404: stdout chunk (state=3): >>> # destroy logging <<< 13830 1727204068.75407: stdout chunk (state=3): >>># destroy argparse <<< 13830 1727204068.75469: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 13830 1727204068.75472: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector <<< 13830 1727204068.75500: stdout chunk (state=3): >>># destroy multiprocessing <<< 13830 1727204068.75538: stdout chunk (state=3): >>># destroy multiprocessing.queues <<< 13830 1727204068.75562: stdout chunk (state=3): >>># destroy multiprocessing.synchronize<<< 13830 1727204068.75568: stdout chunk (state=3): >>> # destroy multiprocessing.dummy <<< 13830 1727204068.75571: stdout chunk (state=3): >>># destroy multiprocessing.pool <<< 13830 1727204068.75573: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle<<< 13830 1727204068.75574: stdout chunk (state=3): >>> <<< 13830 1727204068.75605: stdout chunk (state=3): >>># destroy queue <<< 13830 1727204068.75624: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 13830 1727204068.75656: stdout chunk (state=3): >>># destroy shlex <<< 13830 1727204068.75685: stdout chunk (state=3): >>># destroy datetime <<< 13830 1727204068.75716: stdout chunk (state=3): >>># destroy base64 <<< 13830 1727204068.75759: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 13830 1727204068.75766: stdout chunk (state=3): >>># destroy json <<< 13830 1727204068.75788: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 13830 1727204068.75817: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing <<< 13830 1727204068.75820: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection <<< 13830 1727204068.75824: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 13830 1727204068.75898: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios <<< 13830 1727204068.75916: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 13830 1727204068.75942: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 13830 1727204068.75946: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 13830 1727204068.75948: stdout chunk (state=3): >>># destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 13830 1727204068.75985: stdout chunk (state=3): 
>>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess <<< 13830 1727204068.75988: stdout chunk (state=3): >>># cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 13830 1727204068.75990: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 13830 1727204068.76000: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 13830 1727204068.76041: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 13830 1727204068.76059: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 13830 1727204068.76067: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 13830 1727204068.76098: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat <<< 13830 1727204068.76131: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 13830 1727204068.76135: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 13830 1727204068.76142: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 13830 1727204068.76158: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 13830 1727204068.76200: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios <<< 13830 1727204068.76204: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 13830 1727204068.76427: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 13830 1727204068.76440: stdout chunk (state=3): >>># destroy tokenize <<< 13830 1727204068.76455: stdout chunk (state=3): >>># destroy _heapq <<< 13830 1727204068.76463: stdout chunk (state=3): >>># 
destroy posixpath # destroy stat <<< 13830 1727204068.76482: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 13830 1727204068.76508: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse <<< 13830 1727204068.76514: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 13830 1727204068.76544: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 13830 1727204068.76591: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 13830 1727204068.77061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204068.77125: stderr chunk (state=3): >>><<< 13830 1727204068.77132: stdout chunk (state=3): >>><<< 13830 1727204068.77255: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90fefdc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90fefb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90fefac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd8490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b8f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b8f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b8f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bf0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b88d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bb2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90bd8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b2ef10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b340a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b275b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b2f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b2e3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f907d6e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907d6940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907d6f40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907d6d90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7100> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b09dc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b026a0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b15700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b35eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f907e7d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b092e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f90b15310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b3ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7e20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7d90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907ba400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907ba4f0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907eff70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e9ac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e9490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90708250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907a5550> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e9f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90b3b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9071ab80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f9071aeb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9072b7c0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9072bd00> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906c5430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9071afa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906d5310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9072b640> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906d53d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7a60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906f1730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906f1a00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f906f17f0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906f18e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906f1d30> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f906fb280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f906f1970> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f906e4ac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f907e7640> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f906f1b20> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8f9060b700> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054a850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f9054a160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054a280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054afa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054a4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054adc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f9054a580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054a100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffedf70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff08370> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff08070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ff08cd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90532dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f905323a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90532f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9057ff40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90551d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90551430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffe1af0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f90551550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90551580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff76fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90591280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff74820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90591400> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f90591c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ff747c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f9052a1c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f905919d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f90591550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9058a940> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # 
extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff68910> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff85dc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ff72550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ff68eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ff72970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ffae7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffb38b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8faf0940> import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffec730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9054d2e0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8ffa5880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f96c550> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffb6910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f9051c970> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffea850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f871c70> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8facfa30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8facf9a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb1db20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb1d550> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb052e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb05970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8fab62b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fab6a00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fab6940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8f8d20d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8ffa13a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8fb05670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7c1eb0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7c19d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8f82dbb0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7d0a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8f834040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f8346d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_yv8n9ls8/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from 
'/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8f8f7b6eb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7b6b80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f763d30> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7b6220> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f7781f0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f5caa00> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8f8f5ca0d0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.47, "5m": 0.3, "15m": 0.13}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2813, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 719, "free": 2813}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": 
"ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 414, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264283635712, "block_size": 4096, "block_total": 65519355, "block_available": 64522372, "block_used": 996983, "inode_total": 131071472, "inode_available": 130998315, "inode_used": 73157, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "28", "epoch": "1727204068", "epoch_int": "1727204068", "date": "2024-09-24", "time": "14:54:28", "iso8601_micro": "2024-09-24T18:54:28.662232Z", "iso8601": "2024-09-24T18:54:28Z", "iso8601_basic": "20240924T145428662232", "iso8601_basic_short": "20240924T145428", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", 
"tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # 
cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # 
cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # 
destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # 
cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # 
destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy 
stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # 
destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 13830 1727204068.78163: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204068.78167: _low_level_execute_command(): starting 13830 1727204068.78171: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204066.9939063-13959-135049366041930/ > /dev/null 2>&1 && sleep 0' 13830 1727204068.78314: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.78317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.78351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.78354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13830 1727204068.78356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.78420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204068.78424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204068.78430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204068.78473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204068.80992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204068.81065: stderr chunk (state=3): >>><<< 13830 1727204068.81071: stdout chunk (state=3): >>><<< 13830 1727204068.81083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204068.81091: handler run complete 13830 1727204068.81175: variable 'ansible_facts' from source: unknown 13830 1727204068.81240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204068.81439: variable 'ansible_facts' from source: unknown 13830 1727204068.81498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204068.81573: attempt loop complete, returning result 13830 1727204068.81577: _execute() done 13830 1727204068.81579: dumping result to json 13830 1727204068.81599: done dumping result, returning 13830 1727204068.81609: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-1659-6b02-000000000015] 13830 1727204068.81611: sending task result for task 0affcd87-79f5-1659-6b02-000000000015 ok: [managed-node3] 13830 1727204068.82126: no more pending results, returning what we have 13830 1727204068.82128: results queue empty 13830 1727204068.82129: checking for any_errors_fatal 13830 1727204068.82130: done checking for any_errors_fatal 13830 1727204068.82131: checking for max_fail_percentage 13830 1727204068.82132: done checking for max_fail_percentage 13830 1727204068.82132: checking to see if all hosts have failed and the running result is not ok 13830 1727204068.82133: done checking to see if all hosts 
have failed 13830 1727204068.82133: getting the remaining hosts for this loop 13830 1727204068.82135: done getting the remaining hosts for this loop 13830 1727204068.82137: getting the next task for host managed-node3 13830 1727204068.82141: done getting next task for host managed-node3 13830 1727204068.82143: ^ task is: TASK: meta (flush_handlers) 13830 1727204068.82144: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204068.82147: getting variables 13830 1727204068.82148: in VariableManager get_vars() 13830 1727204068.82168: Calling all_inventory to load vars for managed-node3 13830 1727204068.82170: Calling groups_inventory to load vars for managed-node3 13830 1727204068.82172: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204068.82178: done sending task result for task 0affcd87-79f5-1659-6b02-000000000015 13830 1727204068.82181: WORKER PROCESS EXITING 13830 1727204068.82190: Calling all_plugins_play to load vars for managed-node3 13830 1727204068.82192: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204068.82195: Calling groups_plugins_play to load vars for managed-node3 13830 1727204068.82307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204068.82423: done with get_vars() 13830 1727204068.82432: done getting variables 13830 1727204068.82475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 13830 1727204068.82514: in VariableManager get_vars() 13830 1727204068.82523: Calling all_inventory to load vars for managed-node3 13830 1727204068.82526: Calling groups_inventory to load vars for managed-node3 13830 1727204068.82528: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204068.82531: Calling all_plugins_play to load vars for managed-node3 13830 1727204068.82533: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204068.82534: Calling groups_plugins_play to load vars for managed-node3 13830 1727204068.82623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204068.82732: done with get_vars() 13830 1727204068.82744: done queuing things up, now waiting for results queue to drain 13830 1727204068.82746: results queue empty 13830 1727204068.82746: checking for any_errors_fatal 13830 1727204068.82748: done checking for any_errors_fatal 13830 1727204068.82752: checking for max_fail_percentage 13830 1727204068.82753: done checking for max_fail_percentage 13830 1727204068.82753: checking to see if all hosts have failed and the running result is not ok 13830 1727204068.82753: done checking to see if all hosts have failed 13830 1727204068.82754: getting the remaining hosts for this loop 13830 1727204068.82755: done getting the remaining hosts for this loop 13830 1727204068.82756: getting the next task for host managed-node3 13830 1727204068.82759: done getting next task for host managed-node3 13830 1727204068.82761: ^ task is: TASK: Include the task 'el_repo_setup.yml' 13830 1727204068.82762: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204068.82765: getting variables 13830 1727204068.82765: in VariableManager get_vars() 13830 1727204068.82771: Calling all_inventory to load vars for managed-node3 13830 1727204068.82772: Calling groups_inventory to load vars for managed-node3 13830 1727204068.82773: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204068.82776: Calling all_plugins_play to load vars for managed-node3 13830 1727204068.82778: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204068.82779: Calling groups_plugins_play to load vars for managed-node3 13830 1727204068.82859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204068.82965: done with get_vars() 13830 1727204068.82972: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:11 Tuesday 24 September 2024 14:54:28 -0400 (0:00:01.887) 0:00:01.908 ***** 13830 1727204068.83024: entering _queue_task() for managed-node3/include_tasks 13830 1727204068.83025: Creating lock for include_tasks 13830 1727204068.83248: worker is 1 (out of 1 available) 13830 1727204068.83261: exiting _queue_task() for managed-node3/include_tasks 13830 1727204068.83274: done queuing things up, now waiting for results queue to drain 13830 1727204068.83275: waiting for pending results... 13830 1727204068.83410: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 13830 1727204068.83471: in run() - task 0affcd87-79f5-1659-6b02-000000000006 13830 1727204068.83480: variable 'ansible_search_path' from source: unknown 13830 1727204068.83512: calling self._execute() 13830 1727204068.83566: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204068.83570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204068.83578: variable 'omit' from source: magic vars 13830 1727204068.83652: _execute() done 13830 1727204068.83655: dumping result to json 13830 1727204068.83658: done dumping result, returning 13830 1727204068.83665: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-1659-6b02-000000000006] 13830 1727204068.83671: sending task result for task 0affcd87-79f5-1659-6b02-000000000006 13830 1727204068.83759: done sending task result for task 0affcd87-79f5-1659-6b02-000000000006 13830 1727204068.83762: WORKER PROCESS EXITING 13830 1727204068.83809: no more pending results, returning what we have 13830 1727204068.83814: in VariableManager get_vars() 13830 1727204068.83844: Calling all_inventory to load vars for managed-node3 13830 1727204068.83846: Calling groups_inventory to load vars for managed-node3 13830 1727204068.83849: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204068.83857: Calling all_plugins_play to load vars for managed-node3 13830 1727204068.83860: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204068.83862: Calling groups_plugins_play to load vars for managed-node3 13830 1727204068.84005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204068.84114: done with get_vars() 13830 1727204068.84119: variable 'ansible_search_path' from 
source: unknown 13830 1727204068.84129: we have included files to process 13830 1727204068.84130: generating all_blocks data 13830 1727204068.84131: done generating all_blocks data 13830 1727204068.84132: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13830 1727204068.84133: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13830 1727204068.84134: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13830 1727204068.84582: in VariableManager get_vars() 13830 1727204068.84593: done with get_vars() 13830 1727204068.84600: done processing included file 13830 1727204068.84601: iterating over new_blocks loaded from include file 13830 1727204068.84603: in VariableManager get_vars() 13830 1727204068.84608: done with get_vars() 13830 1727204068.84609: filtering new block on tags 13830 1727204068.84619: done filtering new block on tags 13830 1727204068.84620: in VariableManager get_vars() 13830 1727204068.84626: done with get_vars() 13830 1727204068.84627: filtering new block on tags 13830 1727204068.84638: done filtering new block on tags 13830 1727204068.84640: in VariableManager get_vars() 13830 1727204068.84649: done with get_vars() 13830 1727204068.84650: filtering new block on tags 13830 1727204068.84659: done filtering new block on tags 13830 1727204068.84660: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 13830 1727204068.84666: extending task lists for all hosts with included blocks 13830 1727204068.84696: done extending task lists 13830 1727204068.84696: done processing included files 13830 1727204068.84697: results queue empty 13830 1727204068.84697: checking for any_errors_fatal 13830 1727204068.84698: done checking for any_errors_fatal 13830 1727204068.84699: checking for max_fail_percentage 13830 1727204068.84699: done checking for max_fail_percentage 13830 1727204068.84700: checking to see if all hosts have failed and the running result is not ok 13830 1727204068.84700: done checking to see if all hosts have failed 13830 1727204068.84701: getting the remaining hosts for this loop 13830 1727204068.84701: done getting the remaining hosts for this loop 13830 1727204068.84703: getting the next task for host managed-node3 13830 1727204068.84706: done getting next task for host managed-node3 13830 1727204068.84707: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 13830 1727204068.84708: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204068.84709: getting variables 13830 1727204068.84710: in VariableManager get_vars() 13830 1727204068.84718: Calling all_inventory to load vars for managed-node3 13830 1727204068.84720: Calling groups_inventory to load vars for managed-node3 13830 1727204068.84721: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204068.84724: Calling all_plugins_play to load vars for managed-node3 13830 1727204068.84725: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204068.84727: Calling groups_plugins_play to load vars for managed-node3 13830 1727204068.84933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204068.85040: done with get_vars() 13830 1727204068.85046: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.020) 0:00:01.929 ***** 13830 1727204068.85097: entering _queue_task() for managed-node3/setup 13830 1727204068.85285: worker is 1 (out of 1 available) 13830 1727204068.85297: exiting _queue_task() for managed-node3/setup 13830 1727204068.85308: done queuing things up, now waiting for results queue to drain 13830 1727204068.85309: waiting for pending results... 13830 1727204068.85457: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 13830 1727204068.85517: in run() - task 0affcd87-79f5-1659-6b02-000000000026 13830 1727204068.85532: variable 'ansible_search_path' from source: unknown 13830 1727204068.85539: variable 'ansible_search_path' from source: unknown 13830 1727204068.85569: calling self._execute() 13830 1727204068.85620: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204068.85625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204068.85638: variable 'omit' from source: magic vars 13830 1727204068.85999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204068.89952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204068.90003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204068.90043: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204068.90079: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204068.90099: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204068.90160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204068.90182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204068.90199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13830 1727204068.90225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204068.90239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204068.90370: variable 'ansible_facts' from source: unknown 13830 1727204068.90408: variable 'network_test_required_facts' from source: task vars 13830 1727204068.90434: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 13830 1727204068.90437: variable 'omit' from source: magic vars 13830 1727204068.90468: variable 'omit' from source: magic vars 13830 1727204068.90498: variable 'omit' from source: magic vars 13830 1727204068.90516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204068.90532: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204068.90548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204068.90558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204068.90568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204068.90589: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204068.90592: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204068.90597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204068.90656: Set connection var ansible_connection to ssh 13830 1727204068.90669: Set connection var ansible_timeout to 10 13830 1727204068.90673: Set connection var ansible_shell_executable to /bin/sh 13830 1727204068.90676: Set connection var ansible_shell_type to sh 13830 1727204068.90679: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204068.90693: Set connection var ansible_pipelining to False 13830 1727204068.90711: variable 'ansible_shell_executable' from source: unknown 13830 1727204068.90714: variable 'ansible_connection' from source: unknown 13830 1727204068.90717: variable 'ansible_module_compression' from source: unknown 13830 1727204068.90719: variable 'ansible_shell_type' from source: unknown 13830 1727204068.90721: variable 'ansible_shell_executable' from source: unknown 13830 1727204068.90723: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204068.90725: variable 'ansible_pipelining' from source: unknown 13830 1727204068.90729: variable 'ansible_timeout' from source: unknown 13830 1727204068.90735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204068.90836: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204068.90843: variable 'omit' from source: magic vars 13830 1727204068.90846: starting attempt loop 13830 
1727204068.90849: running the handler 13830 1727204068.90858: _low_level_execute_command(): starting 13830 1727204068.90862: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204068.91371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.91387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.91403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204068.91416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.91428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.91476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204068.91488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204068.91545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204068.93375: stdout chunk (state=3): >>>/root <<< 13830 1727204068.93539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204068.93594: stderr chunk (state=3): >>><<< 13830 1727204068.93598: stdout chunk (state=3): >>><<< 13830 1727204068.93623: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204068.93637: _low_level_execute_command(): starting 13830 1727204068.93640: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390 `" && echo 
ansible-tmp-1727204068.936235-14018-277675688983390="` echo /root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390 `" ) && sleep 0' 13830 1727204068.94131: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.94135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.94174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204068.94178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.94181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.94235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204068.94241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204068.94242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204068.94292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204068.96273: stdout chunk (state=3): >>>ansible-tmp-1727204068.936235-14018-277675688983390=/root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390 <<< 13830 1727204068.96903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204068.96907: stdout chunk (state=3): >>><<< 13830 1727204068.96909: stderr chunk (state=3): >>><<< 13830 1727204068.96911: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204068.936235-14018-277675688983390=/root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204068.96914: variable 'ansible_module_compression' from source: unknown 13830 
1727204068.96916: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13830 1727204068.96917: variable 'ansible_facts' from source: unknown 13830 1727204068.96919: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390/AnsiballZ_setup.py 13830 1727204068.96985: Sending initial data 13830 1727204068.96988: Sent initial data (153 bytes) 13830 1727204068.97961: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204068.97984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.98000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.98022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.98075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204068.98093: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204068.98108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.98125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204068.98137: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204068.98148: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204068.98161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204068.98178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204068.98193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204068.98207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204068.98219: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204068.98232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204068.98312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204068.98336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204068.98352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204068.98423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204069.00095: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 13830 1727204069.00099: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204069.00144: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204069.00191: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpt_dinbif /root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390/AnsiballZ_setup.py <<< 13830 1727204069.00226: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204069.02581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204069.02805: stderr chunk (state=3): >>><<< 13830 1727204069.02810: stdout chunk (state=3): >>><<< 13830 1727204069.02812: done transferring module to remote 13830 1727204069.02815: _low_level_execute_command(): starting 13830 1727204069.02817: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390/ /root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390/AnsiballZ_setup.py && sleep 0' 13830 1727204069.03525: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204069.03544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204069.03575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204069.03596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.03641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204069.03653: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204069.03672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.03701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204069.03713: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204069.03724: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204069.03737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204069.03751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204069.03770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.03788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204069.03809: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204069.03824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.03915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204069.03940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204069.03956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204069.04044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204069.05792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204069.05886: stderr chunk (state=3): >>><<< 13830 1727204069.05889: stdout chunk (state=3): >>><<< 13830 1727204069.05992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204069.05995: _low_level_execute_command(): starting 13830 1727204069.05998: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390/AnsiballZ_setup.py && sleep 0' 13830 1727204069.07322: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204069.07327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.07330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204069.07483: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204069.07493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.07507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204069.07514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204069.07521: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204069.07530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204069.07542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204069.07553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.07560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204069.07568: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204069.07578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.07653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204069.07675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204069.08381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204069.08469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204069.10388: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 13830 1727204069.10441: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 13830 1727204069.10444: stdout chunk 
(state=3): >>>import '_weakref' # <<< 13830 1727204069.10505: stdout chunk (state=3): >>>import '_io' # <<< 13830 1727204069.10509: stdout chunk (state=3): >>>import 'marshal' # <<< 13830 1727204069.10546: stdout chunk (state=3): >>>import 'posix' # <<< 13830 1727204069.10569: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 13830 1727204069.10607: stdout chunk (state=3): >>>import 'time' # <<< 13830 1727204069.10625: stdout chunk (state=3): >>>import 'zipimport' # <<< 13830 1727204069.10629: stdout chunk (state=3): >>> # installed zipimport hook <<< 13830 1727204069.10673: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204069.10706: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 13830 1727204069.10728: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ff3dc0> <<< 13830 1727204069.10810: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 13830 1727204069.10818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ff3b20> <<< 13830 1727204069.10846: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ff3ac0> <<< 13830 1727204069.10892: stdout chunk (state=3): >>>import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f98490> <<< 13830 1727204069.10936: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 13830 1727204069.10955: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 13830 1727204069.10968: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f98940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f98670> <<< 13830 1727204069.11008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 13830 1727204069.11034: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 13830 
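
The three steps above are the AnsiballZ flow this run records: the assembled AnsiballZ_setup.py is copied up with an sftp "put", made executable with "chmod u+x", and then run with PYTHONVERBOSE=1 so the remote /usr/bin/python3.9 traces every import it performs, which is what the "import ..." and "# code object from ..." chunks that follow are. A minimal local sketch of the same tracing effect; the interpreter and the imported module are chosen arbitrarily:

    import os
    import subprocess
    import sys

    # PYTHONVERBOSE=1 makes the interpreter report every import it resolves,
    # producing the same "import ..." / "# code object from ..." lines that
    # appear in the captured chunks.
    env = {**os.environ, "PYTHONVERBOSE": "1"}
    proc = subprocess.run(
        [sys.executable, "-c", "import json"],  # any import triggers the trace
        env=env,
        capture_output=True,
        text=True,
    )
    # In a plain local run the trace lines arrive on stderr.
    print("\n".join(proc.stderr.splitlines()[:10]))
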
1727204069.11063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 13830 1727204069.11104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 13830 1727204069.11126: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f4f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 13830 1727204069.11146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 13830 1727204069.11205: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f4f220> <<< 13830 1727204069.11253: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 13830 1727204069.11272: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f4f940> <<< 13830 1727204069.11304: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188fb0880> <<< 13830 1727204069.11326: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f48d90> <<< 13830 1727204069.11386: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f72d90> <<< 13830 1727204069.11444: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f98970> <<< 13830 1727204069.11467: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13830 1727204069.11801: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 13830 1727204069.11816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 13830 1727204069.11839: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 13830 1727204069.11853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 13830 1727204069.12487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 13830 1727204069.12535: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188eeff10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ef40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ee75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188eee6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188eef3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188b92eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b929a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b92fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f5188b92df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2160> import '_collections' # <<< 13830 1727204069.12614: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ec9e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ec1700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ed5760> <<< 13830 1727204069.12631: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ef5eb0> <<< 13830 1727204069.12635: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 13830 1727204069.12672: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188ba2d60> <<< 13830 1727204069.12680: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ec9340> <<< 13830 1727204069.12709: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.12722: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188ed5370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188efba60> <<< 13830 1727204069.12746: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 13830 1727204069.12754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 13830 1727204069.12780: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204069.12801: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 13830 1727204069.12806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 13830 1727204069.12826: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2e80> <<< 13830 1727204069.12857: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2df0> <<< 13830 1727204069.12904: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 13830 1727204069.12915: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 13830 1727204069.13016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13830 1727204069.13059: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py <<< 13830 1727204069.13075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc'<<< 13830 1727204069.13121: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b76460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13830 1727204069.13160: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b76550> <<< 13830 1727204069.13338: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b540d0> <<< 13830 1727204069.13386: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba5b20><<< 13830 1727204069.13394: stdout chunk (state=3): >>> <<< 13830 1727204069.13424: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba54c0> <<< 13830 1727204069.13462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 13830 1727204069.13495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc'<<< 13830 1727204069.13502: stdout chunk (state=3): >>> <<< 13830 1727204069.13549: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 13830 1727204069.13580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc'<<< 13830 1727204069.13584: stdout chunk (state=3): >>> <<< 13830 1727204069.13623: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 13830 1727204069.13647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 13830 1727204069.13677: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ac42b0> <<< 13830 1727204069.13732: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b61d60> <<< 13830 1727204069.13821: stdout chunk (state=3): >>>import 'pkgutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba5fa0> <<< 13830 1727204069.13845: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188efb0d0> <<< 13830 1727204069.13886: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py<<< 13830 1727204069.13891: stdout chunk (state=3): >>> <<< 13830 1727204069.13944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 13830 1727204069.13972: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 13830 1727204069.13994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc'<<< 13830 1727204069.13999: stdout chunk (state=3): >>> <<< 13830 1727204069.14022: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ad4be0> <<< 13830 1727204069.14051: stdout chunk (state=3): >>>import 'errno' # <<< 13830 1727204069.14056: stdout chunk (state=3): >>> <<< 13830 1727204069.14096: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.14123: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.14135: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188ad4f10><<< 13830 1727204069.14142: stdout chunk (state=3): >>> <<< 13830 1727204069.14179: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 13830 1727204069.14202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13830 1727204069.14238: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py<<< 13830 1727204069.14262: stdout chunk (state=3): >>> <<< 13830 1727204069.14270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 13830 1727204069.14294: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ae7820> <<< 13830 1727204069.14334: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13830 1727204069.14387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc'<<< 13830 1727204069.14390: stdout chunk (state=3): >>> <<< 13830 1727204069.14440: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ae7d60><<< 13830 1727204069.14444: stdout chunk (state=3): >>> <<< 13830 1727204069.14495: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.14523: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.14537: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188a80490> <<< 13830 1727204069.14557: stdout chunk (state=3): 
>>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ad4f40> <<< 13830 1727204069.14596: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 13830 1727204069.14621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 13830 1727204069.14673: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204069.14686: stdout chunk (state=3): >>> <<< 13830 1727204069.14703: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.14712: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188a90370> <<< 13830 1727204069.14737: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ae76a0> <<< 13830 1727204069.14762: stdout chunk (state=3): >>>import 'pwd' # <<< 13830 1727204069.14818: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188a90430> <<< 13830 1727204069.14856: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2ac0> <<< 13830 1727204069.14884: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13830 1727204069.14907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 13830 1727204069.14946: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py<<< 13830 1727204069.14976: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13830 1727204069.14982: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.14992: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188aac790> <<< 13830 1727204069.15027: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13830 1727204069.15218: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188aaca60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188aac850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188aac940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13830 1727204069.15497: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188aacd90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188ab62e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188aac9d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188aa0b20> <<< 13830 1727204069.15598: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba26a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13830 1727204069.15682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188aacb80> <<< 13830 1727204069.15735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f51889da760> <<< 13830 1727204069.16026: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip' # zipimport: zlib available <<< 13830 1727204069.16111: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.16181: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 13830 1727204069.16198: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.17783: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.19320: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec8b0> <<< 13830 1727204069.19376: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204069.19381: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 13830 1727204069.19419: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 13830 1727204069.19423: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883ec160> <<< 13830 1727204069.19455: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec280> <<< 13830 1727204069.19501: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec5e0> <<< 13830 1727204069.19519: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 13830 1727204069.19567: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ece20> <<< 13830 1727204069.19570: stdout chunk (state=3): >>>import 'atexit' # <<< 13830 1727204069.19595: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883ec580> <<< 13830 1727204069.19619: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 13830 1727204069.19640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13830 1727204069.19707: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec100> <<< 13830 1727204069.19711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 13830 1727204069.19738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13830 1727204069.19751: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 13830 1727204069.19776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 13830 1727204069.19790: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 13830 1727204069.19867: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188381040> <<< 13830 1727204069.19903: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f51882c93d0> <<< 13830 1727204069.19941: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51882c90d0> <<< 13830 1727204069.19970: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13830 1727204069.20023: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51882c9d30> <<< 13830 1727204069.20026: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883d3d90> <<< 13830 1727204069.20186: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883d33a0> <<< 13830 1727204069.20765: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883d3f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51889daa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883aadc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883aa490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883e9a90> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883aa5b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883aa5e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from 
'/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188334f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51889602e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 13830 1727204069.20897: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883317f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188960460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 13830 1727204069.20922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 13830 1727204069.21009: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188960c40> <<< 13830 1727204069.21808: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188331790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188960130> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188960670> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188960730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51889589a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883278e0> <<< 13830 1727204069.21827: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.21845: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188344c70> <<< 13830 1727204069.21869: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188330520> <<< 13830 1727204069.21908: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.21912: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.21917: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188327e80> <<< 13830 1727204069.21934: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188330940> <<< 13830 1727204069.21953: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.21980: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.21984: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 13830 1727204069.22016: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.22124: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.22241: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.22254: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.22274: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 13830 1727204069.22294: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.22318: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.22325: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 13830 1727204069.22349: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.22504: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.22659: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.23429: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.24184: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 13830 1727204069.24193: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 13830 1727204069.24221: stdout chunk (state=3): >>>import 
'ansible.module_utils.six.moves.collections_abc' # <<< 13830 1727204069.24226: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 13830 1727204069.24255: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 13830 1727204069.24276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204069.24352: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.24363: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188341790> <<< 13830 1727204069.24447: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 13830 1727204069.24467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 13830 1727204069.24476: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f518837e850> <<< 13830 1727204069.24497: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ee5fa0> <<< 13830 1727204069.24553: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 13830 1727204069.24578: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.24599: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.24626: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 13830 1727204069.24640: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.24830: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.25024: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 13830 1727204069.25042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13830 1727204069.25080: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883b2310> <<< 13830 1727204069.25098: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.25735: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26343: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26428: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26522: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 13830 1727204069.26538: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26588: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26629: stdout 
chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 13830 1727204069.26649: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26734: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26839: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 13830 1727204069.26855: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26880: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26889: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 13830 1727204069.26915: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.26963: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.27018: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 13830 1727204069.27033: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.27335: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.27629: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13830 1727204069.27664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 13830 1727204069.27672: stdout chunk (state=3): >>>import '_ast' # <<< 13830 1727204069.27772: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883f2ca0> <<< 13830 1727204069.27775: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.27858: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.27950: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 13830 1727204069.27953: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 13830 1727204069.27958: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 13830 1727204069.27981: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.28025: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.28064: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 13830 1727204069.28081: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.28126: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 
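
Every module reported above as "loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/..." is served by Python's zipimport hook: once the payload archive is on sys.path, packages inside it import like any others, which is also what the earlier "# zipimport: found 103 names ..." message refers to. A self-contained sketch of that mechanism; the archive and package names here are invented for illustration:

    import sys
    import zipfile

    # Build a tiny archive holding a single package (illustrative names only).
    with zipfile.ZipFile("payload.zip", "w") as zf:
        zf.writestr("mypkg/__init__.py", "VERSION = '0.1'\n")

    # Placing the archive on sys.path lets the zipimport hook resolve imports
    # from inside it.
    sys.path.insert(0, "payload.zip")
    import mypkg

    print(mypkg.__file__)   # .../payload.zip/mypkg/__init__.py
    print(mypkg.VERSION)    # 0.1
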
1727204069.28180: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.28291: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.28379: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13830 1727204069.28414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204069.28509: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188363c70> <<< 13830 1727204069.28652: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883f2bb0> <<< 13830 1727204069.28694: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/file.py <<< 13830 1727204069.28698: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 13830 1727204069.28776: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.28850: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.28880: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.28925: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 13830 1727204069.28944: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 13830 1727204069.28968: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 13830 1727204069.29010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 13830 1727204069.29033: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 13830 1727204069.29060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13830 1727204069.29185: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f518837fd60> <<< 13830 1727204069.29240: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883c0b80> <<< 13830 1727204069.29324: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187d57eb0> <<< 13830 1727204069.29327: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 13830 1727204069.29359: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29384: stdout chunk (state=3): >>>import 
ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 13830 1727204069.29483: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 13830 1727204069.29493: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29518: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 13830 1727204069.29532: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29601: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29677: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29697: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29724: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29761: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29814: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29849: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29890: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 13830 1727204069.29899: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.29996: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.30089: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.30113: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.30156: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 13830 1727204069.30166: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.30378: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.30600: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.30635: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.30724: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204069.30727: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 13830 1727204069.30753: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 13830 1727204069.30758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 13830 1727204069.30797: stdout chunk (state=3): >>>import 'multiprocessing.process' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f5187c42100> <<< 13830 1727204069.30827: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 13830 1727204069.30846: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 13830 1727204069.30882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 13830 1727204069.30907: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 13830 1727204069.30928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 13830 1727204069.30938: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ea6a60> <<< 13830 1727204069.30973: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187ea69d0> <<< 13830 1727204069.31061: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187e79c70> <<< 13830 1727204069.31073: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187e79c10> <<< 13830 1727204069.31107: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ec5460> <<< 13830 1727204069.31112: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ec53d0> <<< 13830 1727204069.31137: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 13830 1727204069.31158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 13830 1727204069.31180: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 13830 1727204069.31185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 13830 1727204069.31217: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187e89310> <<< 13830 1727204069.31244: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187e899a0> <<< 13830 1727204069.31261: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 13830 1727204069.31273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 13830 
1727204069.31292: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187e89940> <<< 13830 1727204069.31317: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 13830 1727204069.31338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 13830 1727204069.31378: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187ca40d0> <<< 13830 1727204069.31409: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188969c40> <<< 13830 1727204069.31438: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ec5790> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 13830 1727204069.31450: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 13830 1727204069.31474: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31478: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 13830 1727204069.31499: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31554: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31623: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 13830 1727204069.31630: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31691: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31747: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 13830 1727204069.31752: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31763: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31777: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 13830 1727204069.31815: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31845: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 13830 1727204069.31852: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 
1727204069.31904: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.31958: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 13830 1727204069.31963: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.32018: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.32059: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 13830 1727204069.32068: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.32139: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.32208: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.32271: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.32349: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 13830 1727204069.32359: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.32993: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.33575: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 13830 1727204069.33580: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.33648: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.33710: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.33752: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.33783: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 13830 1727204069.33791: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 13830 1727204069.33827: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.33859: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 13830 1727204069.33870: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.33927: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.33995: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 13830 1727204069.34000: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34035: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34067: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 13830 1727204069.34080: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34103: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34142: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 13830 1727204069.34147: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34235: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34333: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 13830 1727204069.34339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 13830 1727204069.34363: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187b95f10> <<< 13830 1727204069.34394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 13830 1727204069.34420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 13830 1727204069.34665: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187b959d0> <<< 13830 1727204069.34678: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 13830 1727204069.34681: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34751: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34837: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 13830 1727204069.34842: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.34951: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.35067: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 13830 1727204069.35072: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.35146: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.35243: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 13830 1727204069.35248: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.35296: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.35343: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 13830 1727204069.35380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 13830 1727204069.35577: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.35583: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187bbdc10> <<< 13830 1727204069.35957: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187c06c40> <<< 13830 1727204069.35963: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 13830 1727204069.36034: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36090: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 13830 1727204069.36098: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36197: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36301: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36439: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36628: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/version.py <<< 13830 1727204069.36645: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 13830 1727204069.36649: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36683: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36730: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 13830 1727204069.36737: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36784: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36839: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 13830 1727204069.36844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 13830 1727204069.36899: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.36909: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187c085e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187c08790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 13830 1727204069.36913: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36931: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 
1727204069.36934: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 13830 1727204069.36939: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.36985: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.37030: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 13830 1727204069.37033: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.37229: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.37427: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 13830 1727204069.37437: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.37550: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.37669: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.37714: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.37765: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 13830 1727204069.37770: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 13830 1727204069.37882: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.37904: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.38080: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.38255: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 13830 1727204069.38267: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 13830 1727204069.38423: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.38575: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 13830 1727204069.38581: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.38619: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.38657: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.39350: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.40019: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip 
/tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 13830 1727204069.40032: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.40150: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.40282: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 13830 1727204069.40288: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.40409: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.40538: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 13830 1727204069.40546: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.40726: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.40921: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 13830 1727204069.40937: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.40941: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 13830 1727204069.40970: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41005: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41054: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 13830 1727204069.41059: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41185: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41303: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41573: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41840: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 13830 1727204069.41843: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available <<< 13830 1727204069.41883: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41937: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 13830 1727204069.41952: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41966: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.41996: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 13830 1727204069.42002: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42077: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42157: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 13830 1727204069.42166: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42182: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42210: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 13830 1727204069.42216: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42283: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42349: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 13830 1727204069.42354: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42417: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42489: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 13830 1727204069.42497: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.42833: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43175: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 13830 1727204069.43181: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43244: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43311: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 13830 1727204069.43318: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43355: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43389: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 13830 1727204069.43403: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43433: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43472: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 13830 1727204069.43477: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43514: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43553: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 13830 1727204069.43561: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43671: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43754: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 13830 1727204069.43759: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43785: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 13830 1727204069.43802: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43839: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43895: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 13830 1727204069.43907: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43921: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43940: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.43995: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44048: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44134: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44227: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 13830 1727204069.44240: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 13830 1727204069.44248: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44300: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44357: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 13830 1727204069.44360: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44625: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44881: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 13830 1727204069.44888: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44938: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.44998: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 13830 
1727204069.45001: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.45053: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.45109: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 13830 1727204069.45117: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.45209: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.45313: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 13830 1727204069.45326: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.45430: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.45540: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py <<< 13830 1727204069.45548: stdout chunk (state=3): >>>import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 13830 1727204069.45670: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204069.46884: stdout chunk (state=3): >>>import 'gc' # <<< 13830 1727204069.47327: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 13830 1727204069.47352: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 13830 1727204069.47370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 13830 1727204069.47426: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204069.47430: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51879a8790> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f518797ad60> <<< 13830 1727204069.47506: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f518797a6a0> <<< 13830 1727204069.48023: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", 
"policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5<<< 13830 1727204069.48069: stdout chunk (state=3): >>>vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", 
"USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "29", "epoch": "1727204069", "epoch_int": "1727204069", "date": "2024-09-24", "time": "14:54:29", "iso8601_micro": "2024-09-24T18:54:29.478560Z", "iso8601": "2024-09-24T18:54:29Z", "iso8601_basic": "20240924T145429478560", "iso8601_basic_short": "20240924T145429", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13830 1727204069.48541: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins<<< 13830 1727204069.48573: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # 
cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil <<< 13830 1727204069.48585: stdout chunk (state=3): >>># cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils <<< 13830 1727204069.48595: stdout chunk (state=3): >>># destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 13830 1727204069.48635: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing 
socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 13830 1727204069.48650: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing 
multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns <<< 13830 1727204069.48654: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # 
cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd <<< 13830 1727204069.49116: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux <<< 13830 1727204069.49123: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy 
ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 13830 1727204069.49188: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse <<< 13830 1727204069.49237: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 13830 1727204069.49245: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 13830 1727204069.49274: stdout chunk (state=3): >>># destroy queue <<< 13830 1727204069.49278: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 13830 1727204069.49397: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 <<< 13830 1727204069.49402: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 13830 1727204069.49407: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 13830 1727204069.49478: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping 
_ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess <<< 13830 1727204069.49502: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 13830 1727204069.49525: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 13830 1727204069.49534: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 13830 1727204069.49550: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 13830 1727204069.49580: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 13830 1727204069.49610: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 13830 1727204069.49613: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 13830 1727204069.49648: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios <<< 13830 1727204069.49657: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 13830 1727204069.49854: stdout chunk 
(state=3): >>># destroy platform # destroy _uuid <<< 13830 1727204069.49861: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 13830 1727204069.49891: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 13830 1727204069.49900: stdout chunk (state=3): >>># destroy stat <<< 13830 1727204069.49929: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 13830 1727204069.49939: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 13830 1727204069.49942: stdout chunk (state=3): >>># destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 13830 1727204069.49947: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 13830 1727204069.50270: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 13830 1727204069.50530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204069.50592: stderr chunk (state=3): >>><<< 13830 1727204069.50595: stdout chunk (state=3): >>><<< 13830 1727204069.51140: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ff3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ff3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ff3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f98490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object 
from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f98940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f98670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f4f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f4f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f72850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f4f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188fb0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f48d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f72d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188f98970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188eeff10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ef40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ee75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188eee6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188eef3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188b92eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b929a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b92fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b92df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2160> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ec9e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ec1700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ed5760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ef5eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188ba2d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ec9340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188ed5370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188efba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b76460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b76550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b540d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba5b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba54c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ac42b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188b61d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba5fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188efb0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ad4be0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188ad4f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ae7820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ae7d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188a80490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ad4f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188a90370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ae76a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188a90430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba2ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188aac790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188aaca60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188aac850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188aac940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188aacd90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188ab62e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188aac9d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188aa0b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188ba26a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188aacb80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f51889da760> # zipimport: found 103 names in '/tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883ec160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ece20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883ec580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883ec100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188381040> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51882c93d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51882c90d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51882c9d30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883d3d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883d33a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883d3f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51889daa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883aadc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883aa490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883e9a90> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883aa5b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883aa5e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188334f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51889602e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883317f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188960460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188960c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188331790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188960130> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188960670> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188960730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51889589a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51883278e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188344c70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188330520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188327e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188330940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188341790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f518837e850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ee5fa0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/_text.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883b2310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883f2ca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5188363c70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883f2bb0> import 
ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f518837fd60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51883c0b80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187d57eb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187c42100> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ea6a60> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187ea69d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187e79c70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187e79c10> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ec5460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ec53d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187e89310> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187e899a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187e89940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187ca40d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5188969c40> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187ec5790> import 
ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187b95f10> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187b959d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187bbdc10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187c06c40> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5187c085e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5187c08790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_4gzp0nq3/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51879a8790> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f518797ad60> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f518797a6a0> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "29", "epoch": "1727204069", "epoch_int": "1727204069", "date": "2024-09-24", "time": "14:54:29", "iso8601_micro": "2024-09-24T18:54:29.478560Z", "iso8601": "2024-09-24T18:54:29Z", "iso8601_basic": "20240924T145429478560", "iso8601_basic_short": "20240924T145429", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear 
sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # 
cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # 
destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy 
getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy 
_blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
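Editor's note, a minimal sketch: the exchange above is the remote half of the fact-gathering task. Ansible reuses the already-established SSH ControlMaster session ("auto-mux: Trying existing master"), runs the zipped setup payload with the "min" gather subset, and reads back the ansible_facts JSON followed by the interpreter's shutdown trace. The task below is an illustrative reconstruction matching the module_args recorded in this log (gather_subset: ["min"], gather_timeout: 10); the actual task lives in the network role test playbook and may differ.

    - name: Gather the minimum subset of ansible_facts required by the network role test
      ansible.builtin.setup:
        gather_subset:
          - min
        gather_timeout: 10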
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path ... # clear sys.audit hooks
13830 1727204069.52486: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204069.52490: _low_level_execute_command(): starting 13830 1727204069.52492: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204068.936235-14018-277675688983390/ > /dev/null 2>&1 && sleep 0' 13830 1727204069.52494: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204069.52496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204069.52514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204069.52537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.52586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204069.52596: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204069.52609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.52624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204069.52638: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204069.52648: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204069.52658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204069.52671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204069.52686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.52696: stderr chunk (state=3): >>>debug2: checking match for 'final all'
host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204069.52704: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204069.52715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.52798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204069.52816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204069.52833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204069.52912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204069.54694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204069.54718: stderr chunk (state=3): >>><<< 13830 1727204069.54722: stdout chunk (state=3): >>><<< 13830 1727204069.54735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204069.54741: handler run complete 13830 1727204069.54784: variable 'ansible_facts' from source: unknown 13830 1727204069.54821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204069.54898: variable 'ansible_facts' from source: unknown 13830 1727204069.54931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204069.54963: attempt loop complete, returning result 13830 1727204069.54968: _execute() done 13830 1727204069.54971: dumping result to json 13830 1727204069.54980: done dumping result, returning 13830 1727204069.54991: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-1659-6b02-000000000026] 13830 1727204069.54995: sending task result for task 0affcd87-79f5-1659-6b02-000000000026 13830 1727204069.55126: done sending task result for task 0affcd87-79f5-1659-6b02-000000000026 13830 1727204069.55131: WORKER PROCESS EXITING ok: [managed-node3] 13830 1727204069.55236: no more pending results, returning what we have 13830 1727204069.55241: results queue empty 13830 1727204069.55242: checking for any_errors_fatal 13830 1727204069.55243: done checking for any_errors_fatal 13830 1727204069.55244: checking for max_fail_percentage 13830 1727204069.55246: done checking for max_fail_percentage 13830 1727204069.55246: 
checking to see if all hosts have failed and the running result is not ok 13830 1727204069.55247: done checking to see if all hosts have failed 13830 1727204069.55248: getting the remaining hosts for this loop 13830 1727204069.55249: done getting the remaining hosts for this loop 13830 1727204069.55252: getting the next task for host managed-node3 13830 1727204069.55259: done getting next task for host managed-node3 13830 1727204069.55262: ^ task is: TASK: Check if system is ostree 13830 1727204069.55265: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204069.55268: getting variables 13830 1727204069.55270: in VariableManager get_vars() 13830 1727204069.55295: Calling all_inventory to load vars for managed-node3 13830 1727204069.55298: Calling groups_inventory to load vars for managed-node3 13830 1727204069.55300: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204069.55310: Calling all_plugins_play to load vars for managed-node3 13830 1727204069.55312: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204069.55314: Calling groups_plugins_play to load vars for managed-node3 13830 1727204069.59793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204069.59977: done with get_vars() 13830 1727204069.59990: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:54:29 -0400 (0:00:00.750) 0:00:02.679 ***** 13830 1727204069.60118: entering _queue_task() for managed-node3/stat 13830 1727204069.60422: worker is 1 (out of 1 available) 13830 1727204069.60445: exiting _queue_task() for managed-node3/stat 13830 1727204069.60457: done queuing things up, now waiting for results queue to drain 13830 1727204069.60459: waiting for pending results... 
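The block that follows is Ansible's standard remote-execution round trip for this stat task: it resolves connection variables for managed-node3, finds the remote home directory, creates a per-task temp directory, uploads the AnsiballZ-packaged stat module over the already-open SSH ControlMaster session (the "sftp> put" and "mux_client_request_session" lines), marks it executable, runs it with the remote Python, and finally removes the temp directory. Every command is quoted verbatim in the debug lines below; the short sketch here only condenses them in order. TASK_TMP and the ansible-tmp-example suffix are placeholders standing in for the timestamped directory the real run generates.

# Hand-condensed reconstruction of the remote command sequence recorded in the trace below.
# TASK_TMP is a placeholder; Ansible generates /root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>.
TASK_TMP=/root/.ansible/tmp/ansible-tmp-example
/bin/sh -c 'echo ~ && sleep 0'                                                           # 1. discover the remote home dir
/bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TASK_TMP ) && sleep 0"   # 2. create the per-task temp dir
# 3. sftp put of AnsiballZ_stat.py into $TASK_TMP over the multiplexed SSH connection
/bin/sh -c "chmod u+x $TASK_TMP/ $TASK_TMP/AnsiballZ_stat.py && sleep 0"                 # 4. make the payload executable
/bin/sh -c "PYTHONVERBOSE=1 /usr/bin/python3.9 $TASK_TMP/AnsiballZ_stat.py && sleep 0"   # 5. run it with the remote interpreter
/bin/sh -c "rm -f -r $TASK_TMP/ > /dev/null 2>&1 && sleep 0"                             # 6. clean up the temp dir

The PYTHONVERBOSE=1 prefix in step 5 is also what produces the long "import ..." and "# destroy ..." walls throughout this trace: the remote interpreter logs every module import and its shutdown teardown, essentially the same output you would see from running python3 -v by hand.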
13830 1727204069.60740: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 13830 1727204069.60845: in run() - task 0affcd87-79f5-1659-6b02-000000000028 13830 1727204069.60856: variable 'ansible_search_path' from source: unknown 13830 1727204069.60859: variable 'ansible_search_path' from source: unknown 13830 1727204069.60910: calling self._execute() 13830 1727204069.61018: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204069.61025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204069.61037: variable 'omit' from source: magic vars 13830 1727204069.62919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204069.63612: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204069.64391: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204069.64446: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204069.64558: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204069.64723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204069.64885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204069.64917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204069.64994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204069.65324: Evaluated conditional (not __network_is_ostree is defined): True 13830 1727204069.65336: variable 'omit' from source: magic vars 13830 1727204069.65382: variable 'omit' from source: magic vars 13830 1727204069.65551: variable 'omit' from source: magic vars 13830 1727204069.65585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204069.65646: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204069.65746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204069.65770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204069.65844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204069.65881: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204069.65947: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204069.65956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204069.66184: Set connection var ansible_connection to ssh 13830 1727204069.66200: Set connection var ansible_timeout to 10 13830 1727204069.66211: Set connection var 
ansible_shell_executable to /bin/sh 13830 1727204069.66218: Set connection var ansible_shell_type to sh 13830 1727204069.66228: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204069.66281: Set connection var ansible_pipelining to False 13830 1727204069.66311: variable 'ansible_shell_executable' from source: unknown 13830 1727204069.66380: variable 'ansible_connection' from source: unknown 13830 1727204069.66400: variable 'ansible_module_compression' from source: unknown 13830 1727204069.66408: variable 'ansible_shell_type' from source: unknown 13830 1727204069.66415: variable 'ansible_shell_executable' from source: unknown 13830 1727204069.66422: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204069.66430: variable 'ansible_pipelining' from source: unknown 13830 1727204069.66437: variable 'ansible_timeout' from source: unknown 13830 1727204069.66446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204069.66766: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204069.66840: variable 'omit' from source: magic vars 13830 1727204069.66927: starting attempt loop 13830 1727204069.66942: running the handler 13830 1727204069.66959: _low_level_execute_command(): starting 13830 1727204069.66973: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204069.67997: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204069.68025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.68045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.68110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204069.68133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204069.68240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204069.69698: stdout chunk (state=3): >>>/root <<< 13830 1727204069.69908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204069.69916: stdout chunk (state=3): >>><<< 13830 1727204069.69919: stderr chunk (state=3): >>><<< 13830 1727204069.70037: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204069.70050: _low_level_execute_command(): starting 13830 1727204069.70054: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011 `" && echo ansible-tmp-1727204069.6994636-14047-275745195108011="` echo /root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011 `" ) && sleep 0' 13830 1727204069.70872: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204069.71028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204069.71044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204069.71071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.71115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204069.71132: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204069.71147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.71168: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204069.71181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204069.71193: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204069.71205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204069.71221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204069.71240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204069.71254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204069.71269: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204069.71284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204069.71475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204069.71500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204069.71518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204069.71594: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13830 1727204069.73441: stdout chunk (state=3): >>>ansible-tmp-1727204069.6994636-14047-275745195108011=/root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011 <<< 13830 1727204069.73643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204069.73647: stdout chunk (state=3): >>><<< 13830 1727204069.73649: stderr chunk (state=3): >>><<< 13830 1727204069.73972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204069.6994636-14047-275745195108011=/root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204069.73976: variable 'ansible_module_compression' from source: unknown 13830 1727204069.73978: ANSIBALLZ: Using lock for stat 13830 1727204069.73981: ANSIBALLZ: Acquiring lock 13830 1727204069.73983: ANSIBALLZ: Lock acquired: 140043657886512 13830 1727204069.73985: ANSIBALLZ: Creating module 13830 1727204069.98987: ANSIBALLZ: Writing module into payload 13830 1727204069.99125: ANSIBALLZ: Writing module 13830 1727204069.99153: ANSIBALLZ: Renaming module 13830 1727204069.99165: ANSIBALLZ: Done creating module 13830 1727204069.99187: variable 'ansible_facts' from source: unknown 13830 1727204069.99273: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011/AnsiballZ_stat.py 13830 1727204069.99439: Sending initial data 13830 1727204069.99442: Sent initial data (153 bytes) 13830 1727204070.00512: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204070.00529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.00544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.00569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.00615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.00627: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204070.00643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.00662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204070.00682: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204070.00694: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204070.00707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.00722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.00739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.00751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.00762: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204070.00777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.00857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204070.00880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.00903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.00982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204070.03520: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204070.03568: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204070.03613: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpkeue0hxg /root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011/AnsiballZ_stat.py <<< 13830 1727204070.03630: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204070.04776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204070.05076: stderr chunk (state=3): >>><<< 13830 1727204070.05080: stdout chunk (state=3): >>><<< 13830 1727204070.05082: done transferring module to remote 13830 1727204070.05084: _low_level_execute_command(): starting 13830 1727204070.05086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011/ /root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011/AnsiballZ_stat.py && sleep 0' 13830 1727204070.05757: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204070.05779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.05794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.05814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.05871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 
1727204070.05887: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204070.05902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.05921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204070.05935: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204070.05947: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204070.05958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.05986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.06003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.06017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.06028: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204070.06043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.06136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204070.06159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.06178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.06268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204070.08802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204070.08914: stderr chunk (state=3): >>><<< 13830 1727204070.08938: stdout chunk (state=3): >>><<< 13830 1727204070.09162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204070.09166: _low_level_execute_command(): starting 13830 1727204070.09168: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011/AnsiballZ_stat.py && sleep 0' 13830 1727204070.10528: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204070.10540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.10552: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.10594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.10609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.10616: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204070.10627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.10646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204070.10724: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204070.10728: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204070.10730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.10732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.10735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.10739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.10741: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204070.10743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.10934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204070.10962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.10984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.11076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204070.13815: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 13830 1727204070.13877: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 13830 1727204070.13894: stdout chunk (state=3): >>> import '_weakref' # <<< 13830 1727204070.13985: stdout chunk (state=3): >>>import '_io' # <<< 13830 1727204070.14007: stdout chunk (state=3): >>>import 'marshal' # <<< 13830 1727204070.14068: stdout chunk (state=3): >>>import 'posix' # <<< 13830 1727204070.14123: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 13830 1727204070.14152: stdout chunk (state=3): >>># installing zipimport hook <<< 13830 1727204070.14200: stdout chunk (state=3): >>>import 'time' # <<< 13830 1727204070.14214: stdout chunk (state=3): >>> <<< 13830 1727204070.14240: stdout chunk (state=3): >>>import 'zipimport' # <<< 13830 1727204070.14244: stdout chunk (state=3): >>># installed zipimport hook <<< 13830 1727204070.14340: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py<<< 13830 1727204070.14344: stdout chunk (state=3): >>> <<< 13830 1727204070.14347: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc'<<< 13830 1727204070.14361: stdout chunk (state=3): >>> <<< 13830 1727204070.14387: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py<<< 13830 1727204070.14390: stdout chunk (state=3): >>> <<< 13830 
1727204070.14435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc'<<< 13830 1727204070.14443: stdout chunk (state=3): >>> <<< 13830 1727204070.14460: stdout chunk (state=3): >>>import '_codecs' # <<< 13830 1727204070.14504: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13f3dc0> <<< 13830 1727204070.14583: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py<<< 13830 1727204070.14587: stdout chunk (state=3): >>> <<< 13830 1727204070.14621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 13830 1727204070.14627: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13983a0><<< 13830 1727204070.14634: stdout chunk (state=3): >>> <<< 13830 1727204070.14649: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13f3b20> <<< 13830 1727204070.14700: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 13830 1727204070.14703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc'<<< 13830 1727204070.14718: stdout chunk (state=3): >>> <<< 13830 1727204070.14747: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13f3ac0><<< 13830 1727204070.14750: stdout chunk (state=3): >>> <<< 13830 1727204070.14785: stdout chunk (state=3): >>>import '_signal' # <<< 13830 1727204070.14838: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 13830 1727204070.14841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 13830 1727204070.14877: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1398490> <<< 13830 1727204070.14926: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 13830 1727204070.14932: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc'<<< 13830 1727204070.14949: stdout chunk (state=3): >>> <<< 13830 1727204070.14973: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py<<< 13830 1727204070.14996: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc'<<< 13830 1727204070.15002: stdout chunk (state=3): >>> <<< 13830 1727204070.15032: stdout chunk (state=3): >>>import '_abc' # <<< 13830 1727204070.15036: stdout chunk (state=3): >>> <<< 13830 1727204070.15053: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1398940> <<< 13830 1727204070.15092: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1398670> <<< 13830 1727204070.15140: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 13830 1727204070.15178: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 13830 1727204070.15219: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 13830 1727204070.15251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 13830 1727204070.15293: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 13830 1727204070.15335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 13830 1727204070.15384: stdout chunk (state=3): >>>import '_stat' # <<< 13830 1727204070.15418: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca134f190><<< 13830 1727204070.15422: stdout chunk (state=3): >>> <<< 13830 1727204070.15441: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 13830 1727204070.15469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 13830 1727204070.15587: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca134f220> <<< 13830 1727204070.15632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 13830 1727204070.15667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 13830 1727204070.15702: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py<<< 13830 1727204070.15754: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1372850> <<< 13830 1727204070.15775: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca134f940> <<< 13830 1727204070.15814: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13b0880><<< 13830 1727204070.15844: stdout chunk (state=3): >>> <<< 13830 1727204070.15892: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc'<<< 13830 1727204070.15904: stdout chunk (state=3): >>> import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1348d90> <<< 13830 1727204070.16105: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc'<<< 13830 1727204070.16148: stdout chunk (state=3): >>> import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1372d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1398970> <<< 13830 1727204070.16246: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13830 1727204070.16898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12edf10> <<< 13830 1727204070.16912: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12f40a0> <<< 13830 1727204070.16952: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 13830 1727204070.16994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 13830 1727204070.17022: stdout chunk (state=3): >>>import '_sre' # <<< 13830 1727204070.17059: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py<<< 13830 1727204070.17079: stdout chunk (state=3): >>> <<< 13830 1727204070.17125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 13830 1727204070.17144: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 13830 1727204070.17166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 13830 1727204070.17236: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12e75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12ee6a0><<< 13830 1727204070.17261: stdout chunk (state=3): >>> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12ed3d0><<< 13830 1727204070.17274: stdout chunk (state=3): >>> <<< 13830 1727204070.17303: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 13830 1727204070.17405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc'<<< 13830 1727204070.17416: stdout chunk (state=3): >>> <<< 13830 1727204070.17456: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 13830 1727204070.17503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc'<<< 13830 1727204070.17513: stdout chunk (state=3): >>> <<< 13830 1727204070.17563: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 13830 1727204070.17578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 13830 1727204070.17634: stdout chunk (state=3): >>># extension module '_heapq' loaded from 
'/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.17672: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca1271eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12719a0><<< 13830 1727204070.17703: stdout chunk (state=3): >>> import 'itertools' # <<< 13830 1727204070.17756: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 13830 1727204070.17785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1271fa0><<< 13830 1727204070.17814: stdout chunk (state=3): >>> <<< 13830 1727204070.17826: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 13830 1727204070.17850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 13830 1727204070.17899: stdout chunk (state=3): >>>import '_operator' # <<< 13830 1727204070.17913: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1271df0> <<< 13830 1727204070.17953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 13830 1727204070.17984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 13830 1727204070.17994: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281160> <<< 13830 1727204070.18030: stdout chunk (state=3): >>>import '_collections' # <<< 13830 1727204070.18092: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12c9e20><<< 13830 1727204070.18122: stdout chunk (state=3): >>> <<< 13830 1727204070.18139: stdout chunk (state=3): >>>import '_functools' # <<< 13830 1727204070.18170: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12c1700> <<< 13830 1727204070.18275: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc'<<< 13830 1727204070.18311: stdout chunk (state=3): >>> import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12d5760> <<< 13830 1727204070.18324: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12f5eb0> <<< 13830 1727204070.18371: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 13830 1727204070.18443: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.18486: stdout chunk (state=3): >>> import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca1281d60> 
<<< 13830 1727204070.18499: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12c9340> <<< 13830 1727204070.18554: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.18590: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca12d5370> <<< 13830 1727204070.18618: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12fba60> <<< 13830 1727204070.18649: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 13830 1727204070.18665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 13830 1727204070.18720: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204070.18752: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 13830 1727204070.18785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 13830 1727204070.18828: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281e80> <<< 13830 1727204070.18880: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 13830 1727204070.18917: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281df0> <<< 13830 1727204070.18959: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 13830 1727204070.18983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 13830 1727204070.19009: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 13830 1727204070.19044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 13830 1727204070.19082: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 13830 1727204070.19168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 13830 1727204070.19215: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py<<< 13830 1727204070.19241: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1255460> <<< 13830 1727204070.19284: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 13830 1727204070.19316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 13830 1727204070.19388: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1255550> <<< 13830 1727204070.19578: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12330d0> <<< 13830 1727204070.19808: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1284b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12844c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f6e2b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1240d60> <<< 13830 1727204070.19862: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1284fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12fb0d0> <<< 13830 1727204070.19891: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 13830 1727204070.19911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 13830 1727204070.19953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f7ebe0> import 'errno' # <<< 13830 1727204070.19985: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f7ef10> <<< 13830 1727204070.20016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 13830 1727204070.20070: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f91820> <<< 13830 1727204070.20082: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 13830 1727204070.20119: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 13830 1727204070.20149: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f91d60> <<< 13830 1727204070.20189: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f1f490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f7ef40> <<< 13830 1727204070.20216: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 13830 1727204070.20287: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f2f370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f916a0> import 'pwd' # <<< 13830 1727204070.20313: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f2f430> <<< 13830 1727204070.20353: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281ac0> <<< 13830 1727204070.20391: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 13830 1727204070.20422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 13830 1727204070.20434: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 13830 1727204070.20471: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f4b790> <<< 13830 1727204070.20520: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 13830 1727204070.20546: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f4ba60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f4b850> <<< 13830 1727204070.20561: stdout chunk (state=3): >>># extension module '_random' loaded from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f4b940> <<< 13830 1727204070.20587: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 13830 1727204070.20869: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f4bd90> <<< 13830 1727204070.20873: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f552e0> <<< 13830 1727204070.20876: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f4b9d0> <<< 13830 1727204070.20895: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f3fb20> <<< 13830 1727204070.20919: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12816a0> <<< 13830 1727204070.20944: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 13830 1727204070.21012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 13830 1727204070.21052: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f4bb80> <<< 13830 1727204070.21188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 13830 1727204070.21210: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7ca0e66760> <<< 13830 1727204070.21410: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip' # zipimport: zlib available <<< 13830 1727204070.21540: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.21570: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/__init__.py <<< 13830 1727204070.21584: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.21596: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.21608: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 13830 1727204070.21618: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.23745: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.25099: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 13830 
1727204070.25117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 13830 1727204070.25142: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d8b0> <<< 13830 1727204070.25273: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 13830 1727204070.25277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc'<<< 13830 1727204070.25280: stdout chunk (state=3): >>> <<< 13830 1727204070.25381: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 13830 1727204070.25384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 13830 1727204070.25386: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d8d160> <<< 13830 1727204070.25588: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d280> <<< 13830 1727204070.25591: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d5e0> <<< 13830 1727204070.25593: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 13830 1727204070.25595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc'<<< 13830 1727204070.25597: stdout chunk (state=3): >>> <<< 13830 1727204070.25635: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d4f0> <<< 13830 1727204070.25649: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8de20> <<< 13830 1727204070.25687: stdout chunk (state=3): >>>import 'atexit' # <<< 13830 1727204070.25751: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.25758: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d8d580> <<< 13830 1727204070.25800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py<<< 13830 1727204070.25804: stdout chunk (state=3): >>> <<< 13830 1727204070.25844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 13830 1727204070.25919: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d100> <<< 13830 1727204070.25943: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches 
/usr/lib64/python3.9/platform.py <<< 13830 1727204070.25974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 13830 1727204070.26016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 13830 1727204070.26056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 13830 1727204070.26097: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 13830 1727204070.26110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 13830 1727204070.26246: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca07adfd0> <<< 13830 1727204070.26304: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.26329: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca07cbc40><<< 13830 1727204070.26338: stdout chunk (state=3): >>> <<< 13830 1727204070.26399: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.26413: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca07cbf40> <<< 13830 1727204070.26459: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 13830 1727204070.26510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 13830 1727204070.26577: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca07cb2e0> <<< 13830 1727204070.26613: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0df5d90> <<< 13830 1727204070.26918: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0df53a0> <<< 13830 1727204070.26953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc'<<< 13830 1727204070.26986: stdout chunk (state=3): >>> <<< 13830 1727204070.27047: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0df5f40> <<< 13830 1727204070.27051: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 13830 1727204070.27080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 13830 1727204070.27117: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 13830 1727204070.27133: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 13830 1727204070.27166: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 13830 1727204070.27192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 13830 1727204070.27237: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 13830 1727204070.27247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 13830 1727204070.27272: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0e66a90> <<< 13830 1727204070.27395: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d61dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d61490> <<< 13830 1727204070.27419: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d97580> <<< 13830 1727204070.27457: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.27494: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.27499: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d615b0> <<< 13830 1727204070.27557: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc'<<< 13830 1727204070.27570: stdout chunk (state=3): >>> <<< 13830 1727204070.27586: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d615e0> <<< 13830 1727204070.27628: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 13830 1727204070.27654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc'<<< 13830 1727204070.27660: stdout chunk (state=3): >>> <<< 13830 1727204070.27697: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py<<< 13830 1727204070.27703: stdout chunk (state=3): >>> <<< 13830 1727204070.27750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc'<<< 13830 1727204070.27756: stdout chunk (state=3): >>> <<< 13830 1727204070.27858: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.27885: stdout chunk (state=3): >>> # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.27899: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079ef70><<< 13830 1727204070.27911: stdout chunk (state=3): >>> <<< 13830 1727204070.27928: stdout chunk (state=3): 
>>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0dd52e0> <<< 13830 1727204070.27959: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 13830 1727204070.27994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc'<<< 13830 1727204070.27999: stdout chunk (state=3): >>> <<< 13830 1727204070.28081: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.28093: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079b7f0><<< 13830 1727204070.28114: stdout chunk (state=3): >>> <<< 13830 1727204070.28128: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0dd5460> <<< 13830 1727204070.28169: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 13830 1727204070.28236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204070.28277: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py<<< 13830 1727204070.28301: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc'<<< 13830 1727204070.28313: stdout chunk (state=3): >>> <<< 13830 1727204070.28327: stdout chunk (state=3): >>>import '_string' # <<< 13830 1727204070.28425: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0dedf40> <<< 13830 1727204070.28656: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca079b790><<< 13830 1727204070.28662: stdout chunk (state=3): >>> <<< 13830 1727204070.28797: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.28830: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.28839: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079b5e0> <<< 13830 1727204070.28886: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.28934: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079a550> <<< 13830 1727204070.29119: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.29122: stdout chunk (state=3): >>> <<< 13830 1727204070.29124: stdout chunk (state=3): 
>>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079a490> <<< 13830 1727204070.29129: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0dce9a0> <<< 13830 1727204070.29134: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 13830 1727204070.29136: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 13830 1727204070.29153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 13830 1727204070.29222: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.29246: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d566a0> <<< 13830 1727204070.29559: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.29586: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d54bb0><<< 13830 1727204070.29593: stdout chunk (state=3): >>> <<< 13830 1727204070.29627: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d650d0> <<< 13830 1727204070.29674: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.29696: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.29720: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d56100><<< 13830 1727204070.29739: stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d99c40> <<< 13830 1727204070.29760: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.29777: stdout chunk (state=3): >>> <<< 13830 1727204070.29791: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.29804: stdout chunk (state=3): >>> <<< 13830 1727204070.29817: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 13830 1727204070.29856: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.29986: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.30109: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.30115: stdout chunk (state=3): >>> <<< 13830 1727204070.30144: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.30171: stdout chunk (state=3): >>> import ansible.module_utils.common # loaded from Zip 
/tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 13830 1727204070.30208: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.30219: stdout chunk (state=3): >>> <<< 13830 1727204070.30244: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py<<< 13830 1727204070.30247: stdout chunk (state=3): >>> <<< 13830 1727204070.30277: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.30452: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.30624: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.31422: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.32220: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 13830 1727204070.32249: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 13830 1727204070.32280: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 13830 1727204070.32295: stdout chunk (state=3): >>> import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 13830 1727204070.32339: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 13830 1727204070.32376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 13830 1727204070.32461: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 13830 1727204070.32488: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0766940> <<< 13830 1727204070.32584: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 13830 1727204070.32604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 13830 1727204070.32624: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d53d30> <<< 13830 1727204070.32648: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d4a7c0> <<< 13830 1727204070.32715: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 13830 1727204070.32740: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.32774: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.32801: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/_text.py<<< 13830 1727204070.32810: stdout chunk (state=3): >>> <<< 13830 1727204070.32833: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13830 1727204070.33044: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.33260: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 13830 1727204070.33279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 13830 1727204070.33324: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d544c0><<< 13830 1727204070.33345: stdout chunk (state=3): >>> <<< 13830 1727204070.33352: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.34054: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.34694: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.34814: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.34818: stdout chunk (state=3): >>> <<< 13830 1727204070.34910: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 13830 1727204070.34955: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.34958: stdout chunk (state=3): >>> <<< 13830 1727204070.35009: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.35086: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py<<< 13830 1727204070.35103: stdout chunk (state=3): >>> <<< 13830 1727204070.35115: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.35221: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.35327: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 13830 1727204070.35371: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13830 1727204070.35386: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 13830 1727204070.35424: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.35433: stdout chunk (state=3): >>> <<< 13830 1727204070.35498: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.35501: stdout chunk (state=3): >>> <<< 13830 1727204070.35552: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 13830 1727204070.35559: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.35877: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.36187: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 13830 1727204070.36257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 13830 1727204070.36275: stdout chunk (state=3): >>>import '_ast' # <<< 13830 1727204070.36395: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca030c940> <<< 13830 1727204070.36401: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.36506: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.36635: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py<<< 13830 1727204070.36643: stdout chunk (state=3): >>> import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/validation.py<<< 13830 1727204070.36652: stdout chunk (state=3): >>> import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py<<< 13830 1727204070.36670: stdout chunk (state=3): >>> <<< 13830 1727204070.36683: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 13830 1727204070.36712: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.36770: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.36834: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 13830 1727204070.36852: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.36929: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.36989: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.37135: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.37240: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 13830 1727204070.37291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc'<<< 13830 1727204070.37297: stdout chunk (state=3): >>> <<< 13830 1727204070.37446: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so'<<< 13830 1727204070.37475: stdout chunk (state=3): >>> import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0de0b50> <<< 13830 1727204070.37536: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca030b070><<< 13830 1727204070.37539: stdout chunk (state=3): >>> <<< 13830 1727204070.37609: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 13830 1727204070.37652: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.37655: stdout chunk (state=3): >>> <<< 13830 1727204070.37867: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.37952: stdout chunk (state=3): >>># zipimport: zlib 
available<<< 13830 1727204070.37958: stdout chunk (state=3): >>> <<< 13830 1727204070.37983: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.37989: stdout chunk (state=3): >>> <<< 13830 1727204070.38045: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py<<< 13830 1727204070.38049: stdout chunk (state=3): >>> <<< 13830 1727204070.38063: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc'<<< 13830 1727204070.38074: stdout chunk (state=3): >>> <<< 13830 1727204070.38104: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py<<< 13830 1727204070.38108: stdout chunk (state=3): >>> <<< 13830 1727204070.38160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc'<<< 13830 1727204070.38174: stdout chunk (state=3): >>> <<< 13830 1727204070.38197: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py<<< 13830 1727204070.38200: stdout chunk (state=3): >>> <<< 13830 1727204070.38235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 13830 1727204070.38398: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca035c6d0><<< 13830 1727204070.38401: stdout chunk (state=3): >>> <<< 13830 1727204070.38474: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca075dc10> <<< 13830 1727204070.38599: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca075c5b0> <<< 13830 1727204070.38603: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 13830 1727204070.38643: stdout chunk (state=3): >>># zipimport: zlib available<<< 13830 1727204070.38646: stdout chunk (state=3): >>> <<< 13830 1727204070.38701: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.38734: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 13830 1727204070.38848: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 13830 1727204070.38878: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.38931: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.38936: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 13830 1727204070.38956: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.39139: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.39429: stdout chunk (state=3): >>># zipimport: zlib available <<< 13830 1727204070.39676: stdout chunk (state=3): >>> 
{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 13830 1727204070.39718: stdout chunk (state=3): >>># destroy __main__ <<< 13830 1727204070.40086: stdout chunk (state=3): >>># clear builtins._ <<< 13830 1727204070.40090: stdout chunk (state=3): >>># clear sys.path<<< 13830 1727204070.40161: stdout chunk (state=3): >>> # clear sys.argv <<< 13830 1727204070.40185: stdout chunk (state=3): >>># clear sys.ps1 <<< 13830 1727204070.40189: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_type # clear sys.last_value <<< 13830 1727204070.40220: stdout chunk (state=3): >>># clear sys.last_traceback <<< 13830 1727204070.40224: stdout chunk (state=3): >>># clear sys.path_hooks <<< 13830 1727204070.40291: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.meta_path<<< 13830 1727204070.40323: stdout chunk (state=3): >>> <<< 13830 1727204070.40338: stdout chunk (state=3): >>># clear sys.__interactivehook__<<< 13830 1727204070.40360: stdout chunk (state=3): >>> # restore sys.stdin <<< 13830 1727204070.40362: stdout chunk (state=3): >>># restore sys.stdout <<< 13830 1727204070.40453: stdout chunk (state=3): >>># restore sys.stderr<<< 13830 1727204070.40540: stdout chunk (state=3): >>> # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat<<< 13830 1727204070.40580: stdout chunk (state=3): >>> # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path <<< 13830 1727204070.40670: stdout chunk (state=3): >>># cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale<<< 13830 1727204070.40744: stdout chunk (state=3): >>> # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants <<< 13830 1727204070.40850: stdout chunk (state=3): >>># destroy sre_constants <<< 13830 1727204070.40938: stdout chunk (state=3): >>># cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools<<< 13830 1727204070.41021: stdout chunk (state=3): >>> # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing 
struct<<< 13830 1727204070.41106: stdout chunk (state=3): >>> # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings<<< 13830 1727204070.41130: stdout chunk (state=3): >>> # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing<<< 13830 1727204070.41154: stdout chunk (state=3): >>> # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma<<< 13830 1727204070.41197: stdout chunk (state=3): >>> # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal <<< 13830 1727204070.41243: stdout chunk (state=3): >>># cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal<<< 13830 1727204070.41284: stdout chunk (state=3): >>> # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text<<< 13830 1727204070.41289: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # 
cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text <<< 13830 1727204070.41321: stdout chunk (state=3): >>># destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec <<< 13830 1727204070.41349: stdout chunk (state=3): >>># destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules<<< 13830 1727204070.41650: stdout chunk (state=3): >>> # destroy _sitebuiltins <<< 13830 1727204070.41705: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 13830 1727204070.41767: stdout chunk (state=3): >>># destroy zipimport <<< 13830 1727204070.41809: stdout chunk (state=3): >>># destroy _compression<<< 13830 1727204070.41812: stdout chunk (state=3): >>> <<< 13830 1727204070.41814: stdout chunk (state=3): >>># destroy binascii<<< 13830 1727204070.41816: stdout chunk (state=3): >>> # destroy importlib<<< 13830 1727204070.41818: stdout chunk (state=3): >>> # destroy struct # destroy bz2<<< 13830 1727204070.41855: stdout chunk (state=3): >>> <<< 13830 1727204070.41867: stdout chunk (state=3): >>># destroy lzma <<< 13830 1727204070.41888: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy 
systemd.journal<<< 13830 1727204070.41939: stdout chunk (state=3): >>> # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 13830 1727204070.41979: stdout chunk (state=3): >>># destroy json.scanner <<< 13830 1727204070.41986: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 13830 1727204070.42053: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 13830 1727204070.42145: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 13830 1727204070.42168: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging<<< 13830 1727204070.42171: stdout chunk (state=3): >>> # destroy argparse <<< 13830 1727204070.42294: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<< 13830 1727204070.42392: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid<<< 13830 1727204070.42464: stdout chunk (state=3): >>> # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 13830 1727204070.42531: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors <<< 13830 1727204070.42610: stdout chunk (state=3): >>># cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl <<< 13830 1727204070.42676: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 13830 1727204070.42719: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp <<< 13830 1727204070.42732: stdout chunk (state=3): >>># cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib<<< 13830 1727204070.42763: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap<<< 13830 1727204070.42795: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools<<< 13830 1727204070.42828: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator<<< 13830 1727204070.42854: stdout chunk (state=3): >>> # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale<<< 13830 1727204070.42896: stdout chunk (state=3): >>> # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path<<< 13830 1727204070.42935: stdout chunk (state=3): >>> # destroy 
genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 13830 1727204070.42965: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 13830 1727204070.43002: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 13830 1727204070.43017: stdout chunk (state=3): >>> # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime<<< 13830 1727204070.43226: stdout chunk (state=3): >>> # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform <<< 13830 1727204070.43274: stdout chunk (state=3): >>># destroy _uuid # destroy _sre # destroy sre_parse <<< 13830 1727204070.43280: stdout chunk (state=3): >>># destroy tokenize <<< 13830 1727204070.43301: stdout chunk (state=3): >>># destroy _heapq <<< 13830 1727204070.43322: stdout chunk (state=3): >>># destroy posixpath <<< 13830 1727204070.43349: stdout chunk (state=3): >>># destroy stat <<< 13830 1727204070.43382: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 13830 1727204070.43385: stdout chunk (state=3): >>># destroy errno <<< 13830 1727204070.43412: stdout chunk (state=3): >>># destroy signal <<< 13830 1727204070.43415: stdout chunk (state=3): >>># destroy contextlib<<< 13830 1727204070.43442: stdout chunk (state=3): >>> <<< 13830 1727204070.43448: stdout chunk (state=3): >>># destroy pwd <<< 13830 1727204070.43463: stdout chunk (state=3): >>># destroy grp <<< 13830 1727204070.43478: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy selectors <<< 13830 1727204070.43514: stdout chunk (state=3): >>># destroy select<<< 13830 1727204070.43548: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error<<< 13830 1727204070.43571: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response<<< 13830 1727204070.43597: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 13830 1727204070.43615: stdout chunk (state=3): >>> # destroy itertools <<< 13830 1727204070.43636: stdout chunk (state=3): >>># destroy operator <<< 13830 1727204070.43648: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves <<< 13830 1727204070.43661: stdout chunk (state=3): >>># destroy _operator <<< 13830 1727204070.43684: stdout chunk (state=3): >>># destroy _frozen_importlib_external <<< 13830 1727204070.43718: stdout chunk (state=3): >>># destroy _imp # destroy io <<< 13830 1727204070.43728: stdout chunk (state=3): >>># destroy marshal <<< 13830 1727204070.43775: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 13830 1727204070.43785: stdout chunk (state=3): >>># clear sys.audit hooks <<< 13830 1727204070.44229: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204070.44287: stderr chunk (state=3): >>><<< 13830 1727204070.44290: stdout chunk (state=3): >>><<< 13830 1727204070.44356: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13f3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13f3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13f3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1398490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1398940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1398670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca134f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca134f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code 
object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1372850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca134f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca13b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1348d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1372d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1398970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12edf10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12f40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12e75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12ee6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12ed3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches 
/usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca1271eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12719a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1271fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1271df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281160> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12c9e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12c1700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12d5760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12f5eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca1281d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12c9340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca12d5370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12fba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1255460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1255550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12330d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1284b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12844c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f6e2b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1240d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1284fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12fb0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f7ebe0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f7ca0f7ef10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f91820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f91d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f1f490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f7ef40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f2f370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f916a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f2f430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca1281ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f4b790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f4ba60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f4b850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f4b940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f4bd90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0f552e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f4b9d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f3fb20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca12816a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0f4bb80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7ca0e66760> # zipimport: found 30 names in '/tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d8d160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7ca0d8d5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8de20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d8d580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d8d100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca07adfd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca07cbc40> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca07cbf40> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca07cb2e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0df5d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0df53a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0df5f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0e66a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d61dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d61490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d97580> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d615b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d615e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079ef70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0dd52e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079b7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0dd5460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0dedf40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca079b790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079b5e0> # extension 
module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079a550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca079a490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0dce9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d566a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d54bb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d650d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0d56100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d99c40> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0766940> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d53d30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d4a7c0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca0d544c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca030c940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import 
ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ca0de0b50> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca030b070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca035c6d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca075dc10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ca075c5b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_4zqitmyi/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # 
cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # 
cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
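For orientation: the block above is the remote half of a single stat module run. Ansible copied an AnsiballZ payload (ansible_stat_payload.zip) to the target, ran it with the target's Python 3.9 over the multiplexed SSH connection, and the module answered {"changed": false, "stat": {"exists": false}} for /run/ostree-booted before tearing its interpreter state down; the surrounding import/cleanup noise is the interpreter's verbose trace, not part of the module result. A minimal sketch of a task that would produce this invocation (reconstructed from the logged module arguments and the task name reported below; the register name is an assumption, not taken from el_repo_setup.yml):

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # only the path is set explicitly in this sketch
      register: __ostree_booted_stat    # assumed name, for use by the follow-up set_fact

The other options visible in the logged module args (follow=false, get_checksum/get_mime/get_attributes=true, checksum_algorithm=sha1) are simply the stat module's defaults, which is why they appear even though the sketch sets only path.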
[WARNING]: Module invocation had junk after the JSON data: (the junk is the same "# destroy __main__ … # clear sys.audit hooks" interpreter cleanup trace shown in the stdout above)
13830 1727204070.44921: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204070.44924: _low_level_execute_command(): starting 13830 1727204070.44927: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204069.6994636-14047-275745195108011/ > /dev/null 2>&1 && sleep 0' 13830 1727204070.45074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204070.45082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.45085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.45092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.45138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.45141: stderr chunk (state=3): >>>debug1:
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.45143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.45205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204070.45208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.45210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.45507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204070.47948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204070.48007: stderr chunk (state=3): >>><<< 13830 1727204070.48010: stdout chunk (state=3): >>><<< 13830 1727204070.48031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204070.48036: handler run complete 13830 1727204070.48050: attempt loop complete, returning result 13830 1727204070.48053: _execute() done 13830 1727204070.48055: dumping result to json 13830 1727204070.48058: done dumping result, returning 13830 1727204070.48067: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [0affcd87-79f5-1659-6b02-000000000028] 13830 1727204070.48072: sending task result for task 0affcd87-79f5-1659-6b02-000000000028 13830 1727204070.48164: done sending task result for task 0affcd87-79f5-1659-6b02-000000000028 13830 1727204070.48167: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 13830 1727204070.48225: no more pending results, returning what we have 13830 1727204070.48231: results queue empty 13830 1727204070.48231: checking for any_errors_fatal 13830 1727204070.48238: done checking for any_errors_fatal 13830 1727204070.48239: checking for max_fail_percentage 13830 1727204070.48240: done checking for max_fail_percentage 13830 1727204070.48241: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.48241: done 
checking to see if all hosts have failed 13830 1727204070.48242: getting the remaining hosts for this loop 13830 1727204070.48244: done getting the remaining hosts for this loop 13830 1727204070.48248: getting the next task for host managed-node3 13830 1727204070.48253: done getting next task for host managed-node3 13830 1727204070.48256: ^ task is: TASK: Set flag to indicate system is ostree 13830 1727204070.48258: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204070.48261: getting variables 13830 1727204070.48263: in VariableManager get_vars() 13830 1727204070.48294: Calling all_inventory to load vars for managed-node3 13830 1727204070.48296: Calling groups_inventory to load vars for managed-node3 13830 1727204070.48299: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.48309: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.48311: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.48313: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.48483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.48598: done with get_vars() 13830 1727204070.48606: done getting variables 13830 1727204070.48678: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.885) 0:00:03.565 ***** 13830 1727204070.48701: entering _queue_task() for managed-node3/set_fact 13830 1727204070.48702: Creating lock for set_fact 13830 1727204070.48905: worker is 1 (out of 1 available) 13830 1727204070.48918: exiting _queue_task() for managed-node3/set_fact 13830 1727204070.48930: done queuing things up, now waiting for results queue to drain 13830 1727204070.48931: waiting for pending results... 
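For context, the "Set flag to indicate system is ostree" task queued above (el_repo_setup.yml:22) pairs with the stat of /run/ostree-booted that just finished: the registered stat result is folded into a boolean fact. A minimal sketch of how those two tasks are likely laid out, reconstructed from the variable names and the conditional reported in this log (the actual file contents are not shown here, so treat this as an assumption):

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted            # path confirmed by the stat call traced above
  register: __ostree_booted_stat        # variable name reported by the log

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # stat.exists was false, matching the ok result below
  when: not __network_is_ostree is defined                          # conditional evaluated in the log
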
13830 1727204070.49083: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 13830 1727204070.49147: in run() - task 0affcd87-79f5-1659-6b02-000000000029 13830 1727204070.49157: variable 'ansible_search_path' from source: unknown 13830 1727204070.49160: variable 'ansible_search_path' from source: unknown 13830 1727204070.49191: calling self._execute() 13830 1727204070.49250: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.49253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.49260: variable 'omit' from source: magic vars 13830 1727204070.49647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204070.49819: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204070.49851: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204070.49878: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204070.49907: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204070.49971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204070.49989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204070.50009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204070.50032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204070.50118: Evaluated conditional (not __network_is_ostree is defined): True 13830 1727204070.50126: variable 'omit' from source: magic vars 13830 1727204070.50154: variable 'omit' from source: magic vars 13830 1727204070.50244: variable '__ostree_booted_stat' from source: set_fact 13830 1727204070.50284: variable 'omit' from source: magic vars 13830 1727204070.50302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204070.50322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204070.50339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204070.50353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204070.50363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204070.50386: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204070.50389: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.50391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.50457: Set connection var ansible_connection to ssh 13830 
1727204070.50467: Set connection var ansible_timeout to 10 13830 1727204070.50475: Set connection var ansible_shell_executable to /bin/sh 13830 1727204070.50477: Set connection var ansible_shell_type to sh 13830 1727204070.50480: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204070.50488: Set connection var ansible_pipelining to False 13830 1727204070.50504: variable 'ansible_shell_executable' from source: unknown 13830 1727204070.50507: variable 'ansible_connection' from source: unknown 13830 1727204070.50510: variable 'ansible_module_compression' from source: unknown 13830 1727204070.50512: variable 'ansible_shell_type' from source: unknown 13830 1727204070.50514: variable 'ansible_shell_executable' from source: unknown 13830 1727204070.50517: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.50520: variable 'ansible_pipelining' from source: unknown 13830 1727204070.50525: variable 'ansible_timeout' from source: unknown 13830 1727204070.50527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.50600: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204070.50608: variable 'omit' from source: magic vars 13830 1727204070.50611: starting attempt loop 13830 1727204070.50614: running the handler 13830 1727204070.50623: handler run complete 13830 1727204070.50633: attempt loop complete, returning result 13830 1727204070.50635: _execute() done 13830 1727204070.50637: dumping result to json 13830 1727204070.50639: done dumping result, returning 13830 1727204070.50644: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [0affcd87-79f5-1659-6b02-000000000029] 13830 1727204070.50648: sending task result for task 0affcd87-79f5-1659-6b02-000000000029 13830 1727204070.50732: done sending task result for task 0affcd87-79f5-1659-6b02-000000000029 13830 1727204070.50735: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 13830 1727204070.50794: no more pending results, returning what we have 13830 1727204070.50798: results queue empty 13830 1727204070.50799: checking for any_errors_fatal 13830 1727204070.50806: done checking for any_errors_fatal 13830 1727204070.50807: checking for max_fail_percentage 13830 1727204070.50809: done checking for max_fail_percentage 13830 1727204070.50809: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.50810: done checking to see if all hosts have failed 13830 1727204070.50811: getting the remaining hosts for this loop 13830 1727204070.50812: done getting the remaining hosts for this loop 13830 1727204070.50816: getting the next task for host managed-node3 13830 1727204070.50823: done getting next task for host managed-node3 13830 1727204070.50826: ^ task is: TASK: Fix CentOS6 Base repo 13830 1727204070.50831: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204070.50834: getting variables 13830 1727204070.50835: in VariableManager get_vars() 13830 1727204070.50862: Calling all_inventory to load vars for managed-node3 13830 1727204070.50866: Calling groups_inventory to load vars for managed-node3 13830 1727204070.50869: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.50883: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.50886: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.50898: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.51046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.51163: done with get_vars() 13830 1727204070.51171: done getting variables 13830 1727204070.51271: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.025) 0:00:03.591 ***** 13830 1727204070.51291: entering _queue_task() for managed-node3/copy 13830 1727204070.51494: worker is 1 (out of 1 available) 13830 1727204070.51509: exiting _queue_task() for managed-node3/copy 13830 1727204070.51519: done queuing things up, now waiting for results queue to drain 13830 1727204070.51525: waiting for pending results... 
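The "Fix CentOS6 Base repo" task traced below (el_repo_setup.yml:26) is a copy action gated on distribution facts; it is skipped here because ansible_distribution_major_version is not '6'. A rough sketch under that reading; only the when clauses come from the log, and the destination and repo body are placeholders:

- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo    # hypothetical target, for illustration only
    content: |
      # placeholder: vault-style baseurl entries for CentOS 6 would go here
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
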
13830 1727204070.51681: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 13830 1727204070.51740: in run() - task 0affcd87-79f5-1659-6b02-00000000002b 13830 1727204070.51752: variable 'ansible_search_path' from source: unknown 13830 1727204070.51755: variable 'ansible_search_path' from source: unknown 13830 1727204070.51785: calling self._execute() 13830 1727204070.51838: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.51841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.51850: variable 'omit' from source: magic vars 13830 1727204070.52195: variable 'ansible_distribution' from source: facts 13830 1727204070.52213: Evaluated conditional (ansible_distribution == 'CentOS'): True 13830 1727204070.52296: variable 'ansible_distribution_major_version' from source: facts 13830 1727204070.52299: Evaluated conditional (ansible_distribution_major_version == '6'): False 13830 1727204070.52302: when evaluation is False, skipping this task 13830 1727204070.52305: _execute() done 13830 1727204070.52307: dumping result to json 13830 1727204070.52310: done dumping result, returning 13830 1727204070.52322: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [0affcd87-79f5-1659-6b02-00000000002b] 13830 1727204070.52325: sending task result for task 0affcd87-79f5-1659-6b02-00000000002b 13830 1727204070.52410: done sending task result for task 0affcd87-79f5-1659-6b02-00000000002b 13830 1727204070.52412: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13830 1727204070.52495: no more pending results, returning what we have 13830 1727204070.52498: results queue empty 13830 1727204070.52499: checking for any_errors_fatal 13830 1727204070.52503: done checking for any_errors_fatal 13830 1727204070.52504: checking for max_fail_percentage 13830 1727204070.52505: done checking for max_fail_percentage 13830 1727204070.52506: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.52506: done checking to see if all hosts have failed 13830 1727204070.52507: getting the remaining hosts for this loop 13830 1727204070.52509: done getting the remaining hosts for this loop 13830 1727204070.52512: getting the next task for host managed-node3 13830 1727204070.52517: done getting next task for host managed-node3 13830 1727204070.52519: ^ task is: TASK: Include the task 'enable_epel.yml' 13830 1727204070.52521: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204070.52524: getting variables 13830 1727204070.52525: in VariableManager get_vars() 13830 1727204070.52556: Calling all_inventory to load vars for managed-node3 13830 1727204070.52558: Calling groups_inventory to load vars for managed-node3 13830 1727204070.52560: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.52569: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.52570: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.52572: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.52684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.52821: done with get_vars() 13830 1727204070.52827: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.016) 0:00:03.607 ***** 13830 1727204070.52897: entering _queue_task() for managed-node3/include_tasks 13830 1727204070.53084: worker is 1 (out of 1 available) 13830 1727204070.53097: exiting _queue_task() for managed-node3/include_tasks 13830 1727204070.53107: done queuing things up, now waiting for results queue to drain 13830 1727204070.53108: waiting for pending results... 13830 1727204070.53285: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 13830 1727204070.53386: in run() - task 0affcd87-79f5-1659-6b02-00000000002c 13830 1727204070.53402: variable 'ansible_search_path' from source: unknown 13830 1727204070.53415: variable 'ansible_search_path' from source: unknown 13830 1727204070.53449: calling self._execute() 13830 1727204070.53522: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.53533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.53545: variable 'omit' from source: magic vars 13830 1727204070.54007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204070.55855: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204070.55909: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204070.55937: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204070.55962: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204070.55984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204070.56046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204070.56067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204070.56085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 13830 1727204070.56113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204070.56124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204070.56210: variable '__network_is_ostree' from source: set_fact 13830 1727204070.56231: Evaluated conditional (not __network_is_ostree | d(false)): True 13830 1727204070.56235: _execute() done 13830 1727204070.56239: dumping result to json 13830 1727204070.56242: done dumping result, returning 13830 1727204070.56244: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-1659-6b02-00000000002c] 13830 1727204070.56247: sending task result for task 0affcd87-79f5-1659-6b02-00000000002c 13830 1727204070.56341: done sending task result for task 0affcd87-79f5-1659-6b02-00000000002c 13830 1727204070.56343: WORKER PROCESS EXITING 13830 1727204070.56378: no more pending results, returning what we have 13830 1727204070.56384: in VariableManager get_vars() 13830 1727204070.56418: Calling all_inventory to load vars for managed-node3 13830 1727204070.56420: Calling groups_inventory to load vars for managed-node3 13830 1727204070.56423: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.56435: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.56438: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.56441: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.56585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.56698: done with get_vars() 13830 1727204070.56704: variable 'ansible_search_path' from source: unknown 13830 1727204070.56705: variable 'ansible_search_path' from source: unknown 13830 1727204070.56732: we have included files to process 13830 1727204070.56733: generating all_blocks data 13830 1727204070.56734: done generating all_blocks data 13830 1727204070.56738: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13830 1727204070.56739: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13830 1727204070.56740: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13830 1727204070.57237: done processing included file 13830 1727204070.57238: iterating over new_blocks loaded from include file 13830 1727204070.57239: in VariableManager get_vars() 13830 1727204070.57248: done with get_vars() 13830 1727204070.57249: filtering new block on tags 13830 1727204070.57263: done filtering new block on tags 13830 1727204070.57266: in VariableManager get_vars() 13830 1727204070.57273: done with get_vars() 13830 1727204070.57274: filtering new block on tags 13830 1727204070.57280: done filtering new block on tags 13830 1727204070.57282: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 13830 1727204070.57286: extending task lists for all hosts with included blocks 
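The include that was just processed resolves to tests/network/tasks/enable_epel.yml. Judging by the action and the conditional evaluated above, the including task (el_repo_setup.yml:51) is roughly the following; the relative path is assumed from the resolved file shown in the log:

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml            # relative path assumed; resolves under tests/network/tasks/
  when: not __network_is_ostree | d(false)  # conditional evaluated True in the log
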
13830 1727204070.57352: done extending task lists 13830 1727204070.57353: done processing included files 13830 1727204070.57353: results queue empty 13830 1727204070.57354: checking for any_errors_fatal 13830 1727204070.57357: done checking for any_errors_fatal 13830 1727204070.57357: checking for max_fail_percentage 13830 1727204070.57358: done checking for max_fail_percentage 13830 1727204070.57358: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.57359: done checking to see if all hosts have failed 13830 1727204070.57359: getting the remaining hosts for this loop 13830 1727204070.57360: done getting the remaining hosts for this loop 13830 1727204070.57362: getting the next task for host managed-node3 13830 1727204070.57366: done getting next task for host managed-node3 13830 1727204070.57367: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 13830 1727204070.57369: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204070.57371: getting variables 13830 1727204070.57371: in VariableManager get_vars() 13830 1727204070.57377: Calling all_inventory to load vars for managed-node3 13830 1727204070.57379: Calling groups_inventory to load vars for managed-node3 13830 1727204070.57380: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.57384: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.57389: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.57391: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.57476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.57596: done with get_vars() 13830 1727204070.57602: done getting variables 13830 1727204070.57654: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 13830 1727204070.57791: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.049) 0:00:03.656 ***** 13830 1727204070.57823: entering _queue_task() for managed-node3/command 13830 1727204070.57824: Creating lock for command 13830 1727204070.58046: worker is 1 (out of 1 available) 13830 1727204070.58058: exiting _queue_task() for managed-node3/command 13830 1727204070.58070: done queuing things up, now waiting for results queue to drain 13830 1727204070.58072: waiting for pending results... 
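The "Create EPEL {{ ansible_distribution_major_version }}" task (enable_epel.yml:8) renders to "Create EPEL 9" on this CentOS 9 host and is skipped because the major version is not 7 or 8. Only the templated name and the two when clauses are confirmed by the trace below; the command body here is a guess at a typical EPEL bootstrap step:

- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
    # command body assumed, not taken from the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
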
13830 1727204070.58226: running TaskExecutor() for managed-node3/TASK: Create EPEL 9 13830 1727204070.58296: in run() - task 0affcd87-79f5-1659-6b02-000000000046 13830 1727204070.58305: variable 'ansible_search_path' from source: unknown 13830 1727204070.58308: variable 'ansible_search_path' from source: unknown 13830 1727204070.58338: calling self._execute() 13830 1727204070.58390: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.58394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.58403: variable 'omit' from source: magic vars 13830 1727204070.58667: variable 'ansible_distribution' from source: facts 13830 1727204070.58676: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13830 1727204070.58766: variable 'ansible_distribution_major_version' from source: facts 13830 1727204070.58770: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13830 1727204070.58773: when evaluation is False, skipping this task 13830 1727204070.58776: _execute() done 13830 1727204070.58778: dumping result to json 13830 1727204070.58783: done dumping result, returning 13830 1727204070.58788: done running TaskExecutor() for managed-node3/TASK: Create EPEL 9 [0affcd87-79f5-1659-6b02-000000000046] 13830 1727204070.58794: sending task result for task 0affcd87-79f5-1659-6b02-000000000046 13830 1727204070.58888: done sending task result for task 0affcd87-79f5-1659-6b02-000000000046 13830 1727204070.58890: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13830 1727204070.58945: no more pending results, returning what we have 13830 1727204070.58948: results queue empty 13830 1727204070.58949: checking for any_errors_fatal 13830 1727204070.58951: done checking for any_errors_fatal 13830 1727204070.58951: checking for max_fail_percentage 13830 1727204070.58953: done checking for max_fail_percentage 13830 1727204070.58953: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.58954: done checking to see if all hosts have failed 13830 1727204070.58955: getting the remaining hosts for this loop 13830 1727204070.58956: done getting the remaining hosts for this loop 13830 1727204070.58960: getting the next task for host managed-node3 13830 1727204070.58967: done getting next task for host managed-node3 13830 1727204070.58969: ^ task is: TASK: Install yum-utils package 13830 1727204070.58973: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204070.58976: getting variables 13830 1727204070.58977: in VariableManager get_vars() 13830 1727204070.59010: Calling all_inventory to load vars for managed-node3 13830 1727204070.59013: Calling groups_inventory to load vars for managed-node3 13830 1727204070.59016: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.59024: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.59026: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.59029: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.59139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.59256: done with get_vars() 13830 1727204070.59263: done getting variables 13830 1727204070.59338: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.015) 0:00:03.671 ***** 13830 1727204070.59359: entering _queue_task() for managed-node3/package 13830 1727204070.59360: Creating lock for package 13830 1727204070.59561: worker is 1 (out of 1 available) 13830 1727204070.59575: exiting _queue_task() for managed-node3/package 13830 1727204070.59585: done queuing things up, now waiting for results queue to drain 13830 1727204070.59587: waiting for pending results... 
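"Install yum-utils package" (enable_epel.yml:26) is a package action behind the same guard and is skipped for the same reason. A plausible shape, assuming a straightforward present-state install:

- name: Install yum-utils package
  package:
    name: yum-utils     # implied by the task name
    state: present      # assumed
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
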
13830 1727204070.59736: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 13830 1727204070.59809: in run() - task 0affcd87-79f5-1659-6b02-000000000047 13830 1727204070.59817: variable 'ansible_search_path' from source: unknown 13830 1727204070.59821: variable 'ansible_search_path' from source: unknown 13830 1727204070.59851: calling self._execute() 13830 1727204070.59961: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.59967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.59976: variable 'omit' from source: magic vars 13830 1727204070.60239: variable 'ansible_distribution' from source: facts 13830 1727204070.60252: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13830 1727204070.60340: variable 'ansible_distribution_major_version' from source: facts 13830 1727204070.60344: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13830 1727204070.60347: when evaluation is False, skipping this task 13830 1727204070.60350: _execute() done 13830 1727204070.60353: dumping result to json 13830 1727204070.60357: done dumping result, returning 13830 1727204070.60364: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [0affcd87-79f5-1659-6b02-000000000047] 13830 1727204070.60371: sending task result for task 0affcd87-79f5-1659-6b02-000000000047 13830 1727204070.60455: done sending task result for task 0affcd87-79f5-1659-6b02-000000000047 13830 1727204070.60458: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13830 1727204070.60509: no more pending results, returning what we have 13830 1727204070.60513: results queue empty 13830 1727204070.60514: checking for any_errors_fatal 13830 1727204070.60520: done checking for any_errors_fatal 13830 1727204070.60521: checking for max_fail_percentage 13830 1727204070.60522: done checking for max_fail_percentage 13830 1727204070.60523: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.60524: done checking to see if all hosts have failed 13830 1727204070.60525: getting the remaining hosts for this loop 13830 1727204070.60526: done getting the remaining hosts for this loop 13830 1727204070.60530: getting the next task for host managed-node3 13830 1727204070.60536: done getting next task for host managed-node3 13830 1727204070.60538: ^ task is: TASK: Enable EPEL 7 13830 1727204070.60542: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204070.60545: getting variables 13830 1727204070.60547: in VariableManager get_vars() 13830 1727204070.60625: Calling all_inventory to load vars for managed-node3 13830 1727204070.60627: Calling groups_inventory to load vars for managed-node3 13830 1727204070.60630: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.60637: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.60639: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.60640: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.60744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.60861: done with get_vars() 13830 1727204070.60869: done getting variables 13830 1727204070.60914: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.015) 0:00:03.687 ***** 13830 1727204070.60936: entering _queue_task() for managed-node3/command 13830 1727204070.61138: worker is 1 (out of 1 available) 13830 1727204070.61153: exiting _queue_task() for managed-node3/command 13830 1727204070.61166: done queuing things up, now waiting for results queue to drain 13830 1727204070.61168: waiting for pending results... 13830 1727204070.61316: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 13830 1727204070.61392: in run() - task 0affcd87-79f5-1659-6b02-000000000048 13830 1727204070.61402: variable 'ansible_search_path' from source: unknown 13830 1727204070.61406: variable 'ansible_search_path' from source: unknown 13830 1727204070.61436: calling self._execute() 13830 1727204070.61489: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.61494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.61503: variable 'omit' from source: magic vars 13830 1727204070.61777: variable 'ansible_distribution' from source: facts 13830 1727204070.61788: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13830 1727204070.61877: variable 'ansible_distribution_major_version' from source: facts 13830 1727204070.61881: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13830 1727204070.61884: when evaluation is False, skipping this task 13830 1727204070.61888: _execute() done 13830 1727204070.61891: dumping result to json 13830 1727204070.61893: done dumping result, returning 13830 1727204070.61903: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [0affcd87-79f5-1659-6b02-000000000048] 13830 1727204070.61906: sending task result for task 0affcd87-79f5-1659-6b02-000000000048 13830 1727204070.61984: done sending task result for task 0affcd87-79f5-1659-6b02-000000000048 13830 1727204070.61987: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13830 1727204070.62041: no more pending results, returning what we 
have 13830 1727204070.62044: results queue empty 13830 1727204070.62045: checking for any_errors_fatal 13830 1727204070.62051: done checking for any_errors_fatal 13830 1727204070.62052: checking for max_fail_percentage 13830 1727204070.62054: done checking for max_fail_percentage 13830 1727204070.62054: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.62055: done checking to see if all hosts have failed 13830 1727204070.62056: getting the remaining hosts for this loop 13830 1727204070.62057: done getting the remaining hosts for this loop 13830 1727204070.62061: getting the next task for host managed-node3 13830 1727204070.62069: done getting next task for host managed-node3 13830 1727204070.62071: ^ task is: TASK: Enable EPEL 8 13830 1727204070.62075: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204070.62077: getting variables 13830 1727204070.62079: in VariableManager get_vars() 13830 1727204070.62103: Calling all_inventory to load vars for managed-node3 13830 1727204070.62105: Calling groups_inventory to load vars for managed-node3 13830 1727204070.62108: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.62125: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.62128: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.62131: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.62244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.62360: done with get_vars() 13830 1727204070.62369: done getting variables 13830 1727204070.62410: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.014) 0:00:03.702 ***** 13830 1727204070.62433: entering _queue_task() for managed-node3/command 13830 1727204070.62620: worker is 1 (out of 1 available) 13830 1727204070.62635: exiting _queue_task() for managed-node3/command 13830 1727204070.62646: done queuing things up, now waiting for results queue to drain 13830 1727204070.62648: waiting for pending results... 
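"Enable EPEL 8" (enable_epel.yml:37), like "Enable EPEL 7" before it, is a command action behind the same distribution guard and is skipped on this CentOS 9 host. Only the when clauses are visible in the log; the command itself is an assumed example of a repo-enable step:

- name: Enable EPEL 8
  command: dnf config-manager --set-enabled epel   # assumed command, not taken from the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
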
13830 1727204070.62789: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 13830 1727204070.62852: in run() - task 0affcd87-79f5-1659-6b02-000000000049 13830 1727204070.62861: variable 'ansible_search_path' from source: unknown 13830 1727204070.62868: variable 'ansible_search_path' from source: unknown 13830 1727204070.62897: calling self._execute() 13830 1727204070.62998: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.63001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.63008: variable 'omit' from source: magic vars 13830 1727204070.63267: variable 'ansible_distribution' from source: facts 13830 1727204070.63277: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13830 1727204070.63368: variable 'ansible_distribution_major_version' from source: facts 13830 1727204070.63372: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13830 1727204070.63375: when evaluation is False, skipping this task 13830 1727204070.63378: _execute() done 13830 1727204070.63380: dumping result to json 13830 1727204070.63382: done dumping result, returning 13830 1727204070.63388: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [0affcd87-79f5-1659-6b02-000000000049] 13830 1727204070.63394: sending task result for task 0affcd87-79f5-1659-6b02-000000000049 13830 1727204070.63475: done sending task result for task 0affcd87-79f5-1659-6b02-000000000049 13830 1727204070.63478: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13830 1727204070.63573: no more pending results, returning what we have 13830 1727204070.63576: results queue empty 13830 1727204070.63577: checking for any_errors_fatal 13830 1727204070.63580: done checking for any_errors_fatal 13830 1727204070.63581: checking for max_fail_percentage 13830 1727204070.63582: done checking for max_fail_percentage 13830 1727204070.63583: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.63583: done checking to see if all hosts have failed 13830 1727204070.63584: getting the remaining hosts for this loop 13830 1727204070.63585: done getting the remaining hosts for this loop 13830 1727204070.63596: getting the next task for host managed-node3 13830 1727204070.63602: done getting next task for host managed-node3 13830 1727204070.63603: ^ task is: TASK: Enable EPEL 6 13830 1727204070.63606: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204070.63609: getting variables 13830 1727204070.63610: in VariableManager get_vars() 13830 1727204070.63624: Calling all_inventory to load vars for managed-node3 13830 1727204070.63626: Calling groups_inventory to load vars for managed-node3 13830 1727204070.63630: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.63636: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.63638: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.63639: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.63738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.63851: done with get_vars() 13830 1727204070.63857: done getting variables 13830 1727204070.63898: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.014) 0:00:03.717 ***** 13830 1727204070.63920: entering _queue_task() for managed-node3/copy 13830 1727204070.64093: worker is 1 (out of 1 available) 13830 1727204070.64106: exiting _queue_task() for managed-node3/copy 13830 1727204070.64117: done queuing things up, now waiting for results queue to drain 13830 1727204070.64119: waiting for pending results... 13830 1727204070.64263: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 13830 1727204070.64323: in run() - task 0affcd87-79f5-1659-6b02-00000000004b 13830 1727204070.64335: variable 'ansible_search_path' from source: unknown 13830 1727204070.64338: variable 'ansible_search_path' from source: unknown 13830 1727204070.64371: calling self._execute() 13830 1727204070.64418: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.64421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.64429: variable 'omit' from source: magic vars 13830 1727204070.64691: variable 'ansible_distribution' from source: facts 13830 1727204070.64701: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13830 1727204070.64779: variable 'ansible_distribution_major_version' from source: facts 13830 1727204070.64785: Evaluated conditional (ansible_distribution_major_version == '6'): False 13830 1727204070.64790: when evaluation is False, skipping this task 13830 1727204070.64792: _execute() done 13830 1727204070.64795: dumping result to json 13830 1727204070.64797: done dumping result, returning 13830 1727204070.64809: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [0affcd87-79f5-1659-6b02-00000000004b] 13830 1727204070.64811: sending task result for task 0affcd87-79f5-1659-6b02-00000000004b 13830 1727204070.64895: done sending task result for task 0affcd87-79f5-1659-6b02-00000000004b 13830 1727204070.64898: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13830 1727204070.64958: no more pending results, returning what we have 13830 
1727204070.64961: results queue empty 13830 1727204070.64962: checking for any_errors_fatal 13830 1727204070.64967: done checking for any_errors_fatal 13830 1727204070.64968: checking for max_fail_percentage 13830 1727204070.64969: done checking for max_fail_percentage 13830 1727204070.64970: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.64970: done checking to see if all hosts have failed 13830 1727204070.64971: getting the remaining hosts for this loop 13830 1727204070.64972: done getting the remaining hosts for this loop 13830 1727204070.64975: getting the next task for host managed-node3 13830 1727204070.64981: done getting next task for host managed-node3 13830 1727204070.64984: ^ task is: TASK: Set network provider to 'nm' 13830 1727204070.64986: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204070.64989: getting variables 13830 1727204070.64990: in VariableManager get_vars() 13830 1727204070.65009: Calling all_inventory to load vars for managed-node3 13830 1727204070.65010: Calling groups_inventory to load vars for managed-node3 13830 1727204070.65012: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.65025: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.65027: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.65031: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.65136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.65268: done with get_vars() 13830 1727204070.65275: done getting variables 13830 1727204070.65314: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:13 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.014) 0:00:03.731 ***** 13830 1727204070.65335: entering _queue_task() for managed-node3/set_fact 13830 1727204070.65505: worker is 1 (out of 1 available) 13830 1727204070.65517: exiting _queue_task() for managed-node3/set_fact 13830 1727204070.65531: done queuing things up, now waiting for results queue to drain 13830 1727204070.65532: waiting for pending results... 
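The last task before the handler flush, "Set network provider to 'nm'" (tests_bond_options_nm.yml:13), is a plain set_fact; its effect is exactly the ok result recorded below. Reconstructed from that result:

- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
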
13830 1727204070.65673: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 13830 1727204070.65728: in run() - task 0affcd87-79f5-1659-6b02-000000000007 13830 1727204070.65739: variable 'ansible_search_path' from source: unknown 13830 1727204070.65769: calling self._execute() 13830 1727204070.65822: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.65826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.65836: variable 'omit' from source: magic vars 13830 1727204070.65914: variable 'omit' from source: magic vars 13830 1727204070.65938: variable 'omit' from source: magic vars 13830 1727204070.65961: variable 'omit' from source: magic vars 13830 1727204070.65998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204070.66028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204070.66046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204070.66059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204070.66070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204070.66093: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204070.66096: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.66101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.66168: Set connection var ansible_connection to ssh 13830 1727204070.66177: Set connection var ansible_timeout to 10 13830 1727204070.66183: Set connection var ansible_shell_executable to /bin/sh 13830 1727204070.66186: Set connection var ansible_shell_type to sh 13830 1727204070.66188: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204070.66197: Set connection var ansible_pipelining to False 13830 1727204070.66215: variable 'ansible_shell_executable' from source: unknown 13830 1727204070.66218: variable 'ansible_connection' from source: unknown 13830 1727204070.66222: variable 'ansible_module_compression' from source: unknown 13830 1727204070.66225: variable 'ansible_shell_type' from source: unknown 13830 1727204070.66227: variable 'ansible_shell_executable' from source: unknown 13830 1727204070.66230: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.66232: variable 'ansible_pipelining' from source: unknown 13830 1727204070.66239: variable 'ansible_timeout' from source: unknown 13830 1727204070.66242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.66344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204070.66353: variable 'omit' from source: magic vars 13830 1727204070.66356: starting attempt loop 13830 1727204070.66359: running the handler 13830 1727204070.66370: handler run complete 13830 1727204070.66378: attempt loop complete, returning result 13830 1727204070.66380: _execute() done 13830 1727204070.66383: 
dumping result to json 13830 1727204070.66385: done dumping result, returning 13830 1727204070.66391: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [0affcd87-79f5-1659-6b02-000000000007] 13830 1727204070.66396: sending task result for task 0affcd87-79f5-1659-6b02-000000000007 13830 1727204070.66481: done sending task result for task 0affcd87-79f5-1659-6b02-000000000007 13830 1727204070.66484: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 13830 1727204070.66556: no more pending results, returning what we have 13830 1727204070.66559: results queue empty 13830 1727204070.66559: checking for any_errors_fatal 13830 1727204070.66569: done checking for any_errors_fatal 13830 1727204070.66570: checking for max_fail_percentage 13830 1727204070.66572: done checking for max_fail_percentage 13830 1727204070.66572: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.66576: done checking to see if all hosts have failed 13830 1727204070.66578: getting the remaining hosts for this loop 13830 1727204070.66579: done getting the remaining hosts for this loop 13830 1727204070.66582: getting the next task for host managed-node3 13830 1727204070.66585: done getting next task for host managed-node3 13830 1727204070.66587: ^ task is: TASK: meta (flush_handlers) 13830 1727204070.66588: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204070.66591: getting variables 13830 1727204070.66592: in VariableManager get_vars() 13830 1727204070.66610: Calling all_inventory to load vars for managed-node3 13830 1727204070.66612: Calling groups_inventory to load vars for managed-node3 13830 1727204070.66614: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.66620: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.66622: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.66623: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.66730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.66858: done with get_vars() 13830 1727204070.66866: done getting variables 13830 1727204070.66919: in VariableManager get_vars() 13830 1727204070.66925: Calling all_inventory to load vars for managed-node3 13830 1727204070.66927: Calling groups_inventory to load vars for managed-node3 13830 1727204070.66928: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.66932: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.66933: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.66935: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.67018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.67149: done with get_vars() 13830 1727204070.67158: done queuing things up, now waiting for results queue to drain 13830 1727204070.67159: results queue empty 13830 1727204070.67160: checking for any_errors_fatal 13830 1727204070.67161: done checking for any_errors_fatal 13830 1727204070.67162: checking for 
max_fail_percentage 13830 1727204070.67162: done checking for max_fail_percentage 13830 1727204070.67163: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.67163: done checking to see if all hosts have failed 13830 1727204070.67165: getting the remaining hosts for this loop 13830 1727204070.67166: done getting the remaining hosts for this loop 13830 1727204070.67168: getting the next task for host managed-node3 13830 1727204070.67171: done getting next task for host managed-node3 13830 1727204070.67172: ^ task is: TASK: meta (flush_handlers) 13830 1727204070.67172: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204070.67178: getting variables 13830 1727204070.67179: in VariableManager get_vars() 13830 1727204070.67184: Calling all_inventory to load vars for managed-node3 13830 1727204070.67185: Calling groups_inventory to load vars for managed-node3 13830 1727204070.67187: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.67190: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.67191: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.67193: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.67275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.67381: done with get_vars() 13830 1727204070.67386: done getting variables 13830 1727204070.67415: in VariableManager get_vars() 13830 1727204070.67421: Calling all_inventory to load vars for managed-node3 13830 1727204070.67422: Calling groups_inventory to load vars for managed-node3 13830 1727204070.67423: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.67426: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.67427: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.67430: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.67511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.67617: done with get_vars() 13830 1727204070.67625: done queuing things up, now waiting for results queue to drain 13830 1727204070.67626: results queue empty 13830 1727204070.67626: checking for any_errors_fatal 13830 1727204070.67627: done checking for any_errors_fatal 13830 1727204070.67628: checking for max_fail_percentage 13830 1727204070.67628: done checking for max_fail_percentage 13830 1727204070.67629: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.67630: done checking to see if all hosts have failed 13830 1727204070.67630: getting the remaining hosts for this loop 13830 1727204070.67631: done getting the remaining hosts for this loop 13830 1727204070.67632: getting the next task for host managed-node3 13830 1727204070.67634: done getting next task for host managed-node3 13830 1727204070.67635: ^ task is: None 13830 1727204070.67636: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13830 1727204070.67636: done queuing things up, now waiting for results queue to drain 13830 1727204070.67637: results queue empty 13830 1727204070.67637: checking for any_errors_fatal 13830 1727204070.67638: done checking for any_errors_fatal 13830 1727204070.67638: checking for max_fail_percentage 13830 1727204070.67639: done checking for max_fail_percentage 13830 1727204070.67639: checking to see if all hosts have failed and the running result is not ok 13830 1727204070.67640: done checking to see if all hosts have failed 13830 1727204070.67641: getting the next task for host managed-node3 13830 1727204070.67642: done getting next task for host managed-node3 13830 1727204070.67643: ^ task is: None 13830 1727204070.67643: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204070.67684: in VariableManager get_vars() 13830 1727204070.67695: done with get_vars() 13830 1727204070.67699: in VariableManager get_vars() 13830 1727204070.67705: done with get_vars() 13830 1727204070.67708: variable 'omit' from source: magic vars 13830 1727204070.67730: in VariableManager get_vars() 13830 1727204070.67736: done with get_vars() 13830 1727204070.67750: variable 'omit' from source: magic vars PLAY [Play for testing bond options] ******************************************* 13830 1727204070.67944: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 13830 1727204070.67967: getting the remaining hosts for this loop 13830 1727204070.67968: done getting the remaining hosts for this loop 13830 1727204070.67969: getting the next task for host managed-node3 13830 1727204070.67971: done getting next task for host managed-node3 13830 1727204070.67973: ^ task is: TASK: Gathering Facts 13830 1727204070.67973: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204070.67975: getting variables 13830 1727204070.67975: in VariableManager get_vars() 13830 1727204070.67981: Calling all_inventory to load vars for managed-node3 13830 1727204070.67982: Calling groups_inventory to load vars for managed-node3 13830 1727204070.67984: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204070.67988: Calling all_plugins_play to load vars for managed-node3 13830 1727204070.67998: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204070.68000: Calling groups_plugins_play to load vars for managed-node3 13830 1727204070.68080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204070.68185: done with get_vars() 13830 1727204070.68190: done getting variables 13830 1727204070.68219: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.029) 0:00:03.760 ***** 13830 1727204070.68237: entering _queue_task() for managed-node3/gather_facts 13830 1727204070.68417: worker is 1 (out of 1 available) 13830 1727204070.68430: exiting _queue_task() for managed-node3/gather_facts 13830 1727204070.68440: done queuing things up, now waiting for results queue to drain 13830 1727204070.68442: waiting for pending results... 
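The run has now reached the play declared in playbooks/tests_bond_options.yml:3, whose implicit fact-gathering step is traced next. A minimal sketch of the play header, assuming default fact gathering; the host pattern and other keywords are assumptions and are not taken from the log:

    - name: Play for testing bond options
      hosts: all              # assumption: the real host pattern is not visible in this excerpt
      gather_facts: true      # produces the implicit "Gathering Facts" (setup) run traced below
      # the play's own tasks follow in the source file and are not shown here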
13830 1727204070.68589: running TaskExecutor() for managed-node3/TASK: Gathering Facts 13830 1727204070.68656: in run() - task 0affcd87-79f5-1659-6b02-000000000071 13830 1727204070.68663: variable 'ansible_search_path' from source: unknown 13830 1727204070.68692: calling self._execute() 13830 1727204070.68746: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.68749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.68762: variable 'omit' from source: magic vars 13830 1727204070.69018: variable 'ansible_distribution_major_version' from source: facts 13830 1727204070.69029: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204070.69036: variable 'omit' from source: magic vars 13830 1727204070.69053: variable 'omit' from source: magic vars 13830 1727204070.69078: variable 'omit' from source: magic vars 13830 1727204070.69118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204070.69144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204070.69160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204070.69174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204070.69184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204070.69213: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204070.69218: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.69222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.69287: Set connection var ansible_connection to ssh 13830 1727204070.69298: Set connection var ansible_timeout to 10 13830 1727204070.69303: Set connection var ansible_shell_executable to /bin/sh 13830 1727204070.69311: Set connection var ansible_shell_type to sh 13830 1727204070.69318: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204070.69326: Set connection var ansible_pipelining to False 13830 1727204070.69345: variable 'ansible_shell_executable' from source: unknown 13830 1727204070.69348: variable 'ansible_connection' from source: unknown 13830 1727204070.69350: variable 'ansible_module_compression' from source: unknown 13830 1727204070.69353: variable 'ansible_shell_type' from source: unknown 13830 1727204070.69355: variable 'ansible_shell_executable' from source: unknown 13830 1727204070.69357: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204070.69361: variable 'ansible_pipelining' from source: unknown 13830 1727204070.69363: variable 'ansible_timeout' from source: unknown 13830 1727204070.69370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204070.69500: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204070.69507: variable 'omit' from source: magic vars 13830 1727204070.69512: starting attempt loop 13830 1727204070.69514: running the 
handler 13830 1727204070.69528: variable 'ansible_facts' from source: unknown 13830 1727204070.69546: _low_level_execute_command(): starting 13830 1727204070.69553: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204070.70097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.70114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.70126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204070.70138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.70150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.70200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.70214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.70275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204070.72625: stdout chunk (state=3): >>>/root <<< 13830 1727204070.72780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204070.72834: stderr chunk (state=3): >>><<< 13830 1727204070.72838: stdout chunk (state=3): >>><<< 13830 1727204070.72863: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204070.72872: _low_level_execute_command(): starting 13830 1727204070.72880: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581 `" && echo 
ansible-tmp-1727204070.7285802-14143-278377396201581="` echo /root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581 `" ) && sleep 0' 13830 1727204070.73336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.73349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.73373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.73401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.73445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.73457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.73515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204070.76202: stdout chunk (state=3): >>>ansible-tmp-1727204070.7285802-14143-278377396201581=/root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581 <<< 13830 1727204070.76366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204070.76445: stderr chunk (state=3): >>><<< 13830 1727204070.76499: stdout chunk (state=3): >>><<< 13830 1727204070.76607: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204070.7285802-14143-278377396201581=/root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204070.76706: variable 'ansible_module_compression' from source: unknown 13830 1727204070.76973: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13830 1727204070.76977: variable 'ansible_facts' from source: unknown 13830 1727204070.77029: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581/AnsiballZ_setup.py 13830 1727204070.77211: Sending initial data 13830 1727204070.77214: Sent initial data (154 bytes) 13830 1727204070.78398: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204070.78416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.78436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.78457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.78499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.78510: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204070.78523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.78547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204070.78558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204070.78575: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204070.78590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.78603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.78616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.78627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.78643: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204070.78656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.78737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204070.78756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.78772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.78981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204070.81642: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204070.81689: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204070.81737: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpq5z13bce 
/root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581/AnsiballZ_setup.py <<< 13830 1727204070.82178: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204070.84581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204070.84774: stderr chunk (state=3): >>><<< 13830 1727204070.84778: stdout chunk (state=3): >>><<< 13830 1727204070.84780: done transferring module to remote 13830 1727204070.84782: _low_level_execute_command(): starting 13830 1727204070.84784: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581/ /root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581/AnsiballZ_setup.py && sleep 0' 13830 1727204070.86622: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204070.86640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.86656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.86678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.86721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.86734: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204070.86750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.86768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204070.86780: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204070.86793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204070.86805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.86820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.86837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.86852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.86868: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204070.86882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.86956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204070.86988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.87003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.87084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204070.89583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204070.89658: stderr chunk (state=3): >>><<< 13830 1727204070.89661: stdout chunk (state=3): >>><<< 13830 1727204070.89755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204070.89759: _low_level_execute_command(): starting 13830 1727204070.89763: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581/AnsiballZ_setup.py && sleep 0' 13830 1727204070.90938: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204070.90955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.90973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.90994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.91040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.91055: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204070.91074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.91092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204070.91104: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204070.91116: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204070.91129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204070.91143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204070.91158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204070.91172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204070.91186: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204070.91200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204070.91278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204070.91306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204070.91323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204070.91414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204071.57994: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "31", "epoch": "1727204071", "epoch_int": "1727204071", "date": "2024-09-24", "time": "14:54:31", "iso8601_micro": "2024-09-24T18:54:31.276359Z", "iso8601": "2024-09-24T18:54:31Z", "iso8601_basic": "20240924T145431276359", "iso8601_basic_short": "20240924T145431", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "an<<< 13830 1727204071.58015: stdout chunk (state=3): >>>sible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2787, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 745, "free": 2787}, "nocache": {"free": 3245, "used": 287}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fb<<< 13830 1727204071.58050: stdout chunk (state=3): >>>d4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 417, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282296320, "block_size": 4096, "block_total": 65519355, "block_available": 64522045, "block_used": 997310, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_loadavg": {"1m": 0.59, "5m": 0.33, "15m": 0.14}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off 
[fixed]", "hs<<< 13830 1727204071.58347: stdout chunk (state=3): >>>r_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": 
"cpython"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13830 1727204071.60548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204071.60552: stderr chunk (state=3): >>><<< 13830 1727204071.60554: stdout chunk (state=3): >>><<< 13830 1727204071.60681: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "31", "epoch": "1727204071", "epoch_int": "1727204071", "date": "2024-09-24", "time": "14:54:31", "iso8601_micro": "2024-09-24T18:54:31.276359Z", "iso8601": "2024-09-24T18:54:31Z", "iso8601_basic": "20240924T145431276359", "iso8601_basic_short": "20240924T145431", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": 
"/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2787, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 745, "free": 2787}, "nocache": {"free": 3245, "used": 287}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", 
"ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 417, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282296320, "block_size": 4096, "block_total": 65519355, "block_available": 64522045, "block_used": 997310, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_loadavg": {"1m": 0.59, "5m": 0.33, "15m": 0.14}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204071.61023: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204071.61053: _low_level_execute_command(): starting 13830 1727204071.61066: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204070.7285802-14143-278377396201581/ > /dev/null 2>&1 && sleep 0' 13830 1727204071.62905: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204071.62922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204071.62942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204071.62962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204071.63016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204071.63032: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204071.63049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204071.63070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204071.63091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204071.63104: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204071.63116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204071.63134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204071.63152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204071.63168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204071.63181: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204071.63203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204071.63389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204071.63422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204071.63445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204071.63539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204071.66092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204071.66127: stderr chunk (state=3): >>><<< 13830 1727204071.66134: stdout chunk (state=3): >>><<< 13830 1727204071.66275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204071.66278: handler run complete 13830 1727204071.66383: variable 'ansible_facts' from source: unknown 13830 1727204071.66441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.66778: variable 'ansible_facts' from source: unknown 13830 1727204071.66867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.66992: attempt loop complete, returning result 13830 1727204071.67001: _execute() done 13830 1727204071.67007: dumping result to json 13830 1727204071.67046: done dumping result, returning 13830 1727204071.67058: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-1659-6b02-000000000071] 13830 1727204071.67071: sending task result for task 0affcd87-79f5-1659-6b02-000000000071 ok: [managed-node3] 13830 1727204071.67954: no more pending results, returning what we have 13830 1727204071.67957: results queue empty 13830 1727204071.67958: checking for any_errors_fatal 13830 1727204071.67959: done checking for any_errors_fatal 13830 1727204071.67960: checking for max_fail_percentage 13830 1727204071.67961: done checking for max_fail_percentage 13830 1727204071.67962: checking to see if all hosts have failed and the running result is not ok 13830 1727204071.67963: done checking to see if all hosts have failed 13830 1727204071.67965: getting the remaining hosts for this loop 13830 1727204071.67967: done getting the remaining hosts for this loop 13830 1727204071.67970: getting the next task for host managed-node3 13830 1727204071.67977: done getting next task for host managed-node3 13830 1727204071.67979: ^ task is: TASK: meta (flush_handlers) 13830 1727204071.67981: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204071.67983: getting variables 13830 1727204071.67985: in VariableManager get_vars() 13830 1727204071.68011: Calling all_inventory to load vars for managed-node3 13830 1727204071.68014: Calling groups_inventory to load vars for managed-node3 13830 1727204071.68017: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.68031: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.68033: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.68036: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.68357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.68824: done with get_vars() 13830 1727204071.68836: done getting variables 13830 1727204071.68888: done sending task result for task 0affcd87-79f5-1659-6b02-000000000071 13830 1727204071.68891: WORKER PROCESS EXITING 13830 1727204071.68939: in VariableManager get_vars() 13830 1727204071.68948: Calling all_inventory to load vars for managed-node3 13830 1727204071.68950: Calling groups_inventory to load vars for managed-node3 13830 1727204071.68952: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.68956: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.68958: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.68967: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.69087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.69253: done with get_vars() 13830 1727204071.69266: done queuing things up, now waiting for results queue to drain 13830 1727204071.69267: results queue empty 13830 1727204071.69268: checking for any_errors_fatal 13830 1727204071.69271: done checking for any_errors_fatal 13830 1727204071.69271: checking for max_fail_percentage 13830 1727204071.69272: done checking for max_fail_percentage 13830 1727204071.69273: checking to see if all hosts have failed and the running result is not ok 13830 1727204071.69274: done checking to see if all hosts have failed 13830 1727204071.69274: getting the remaining hosts for this loop 13830 1727204071.69275: done getting the remaining hosts for this loop 13830 1727204071.69277: getting the next task for host managed-node3 13830 1727204071.69280: done getting next task for host managed-node3 13830 1727204071.69282: ^ task is: TASK: Show playbook name 13830 1727204071.69284: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204071.69286: getting variables 13830 1727204071.69287: in VariableManager get_vars() 13830 1727204071.69294: Calling all_inventory to load vars for managed-node3 13830 1727204071.69296: Calling groups_inventory to load vars for managed-node3 13830 1727204071.69298: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.69302: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.69304: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.69307: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.69444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.69636: done with get_vars() 13830 1727204071.69643: done getting variables 13830 1727204071.69715: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:32 Tuesday 24 September 2024 14:54:31 -0400 (0:00:01.015) 0:00:04.775 ***** 13830 1727204071.69743: entering _queue_task() for managed-node3/debug 13830 1727204071.69745: Creating lock for debug 13830 1727204071.70044: worker is 1 (out of 1 available) 13830 1727204071.70055: exiting _queue_task() for managed-node3/debug 13830 1727204071.70067: done queuing things up, now waiting for results queue to drain 13830 1727204071.70068: waiting for pending results... 
13830 1727204071.70326: running TaskExecutor() for managed-node3/TASK: Show playbook name 13830 1727204071.70426: in run() - task 0affcd87-79f5-1659-6b02-00000000000b 13830 1727204071.70450: variable 'ansible_search_path' from source: unknown 13830 1727204071.70497: calling self._execute() 13830 1727204071.70697: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.70709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.70723: variable 'omit' from source: magic vars 13830 1727204071.71125: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.71146: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.71156: variable 'omit' from source: magic vars 13830 1727204071.71192: variable 'omit' from source: magic vars 13830 1727204071.71241: variable 'omit' from source: magic vars 13830 1727204071.71292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204071.71340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.71371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204071.71397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.71414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.71455: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.71467: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.71475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.71569: Set connection var ansible_connection to ssh 13830 1727204071.71585: Set connection var ansible_timeout to 10 13830 1727204071.71594: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.71601: Set connection var ansible_shell_type to sh 13830 1727204071.71611: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204071.71622: Set connection var ansible_pipelining to False 13830 1727204071.71649: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.71659: variable 'ansible_connection' from source: unknown 13830 1727204071.71666: variable 'ansible_module_compression' from source: unknown 13830 1727204071.71672: variable 'ansible_shell_type' from source: unknown 13830 1727204071.71679: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.71685: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.71692: variable 'ansible_pipelining' from source: unknown 13830 1727204071.71698: variable 'ansible_timeout' from source: unknown 13830 1727204071.71706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.71868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.71889: variable 'omit' from source: magic vars 13830 1727204071.71899: starting attempt loop 13830 1727204071.71906: running the handler 13830 
1727204071.71962: handler run complete 13830 1727204071.71998: attempt loop complete, returning result 13830 1727204071.72006: _execute() done 13830 1727204071.72014: dumping result to json 13830 1727204071.72021: done dumping result, returning 13830 1727204071.72035: done running TaskExecutor() for managed-node3/TASK: Show playbook name [0affcd87-79f5-1659-6b02-00000000000b] 13830 1727204071.72048: sending task result for task 0affcd87-79f5-1659-6b02-00000000000b ok: [managed-node3] => {} MSG: this is: playbooks/tests_bond_options.yml 13830 1727204071.72297: no more pending results, returning what we have 13830 1727204071.72301: results queue empty 13830 1727204071.72302: checking for any_errors_fatal 13830 1727204071.72304: done checking for any_errors_fatal 13830 1727204071.72305: checking for max_fail_percentage 13830 1727204071.72307: done checking for max_fail_percentage 13830 1727204071.72308: checking to see if all hosts have failed and the running result is not ok 13830 1727204071.72309: done checking to see if all hosts have failed 13830 1727204071.72309: getting the remaining hosts for this loop 13830 1727204071.72311: done getting the remaining hosts for this loop 13830 1727204071.72315: getting the next task for host managed-node3 13830 1727204071.72323: done getting next task for host managed-node3 13830 1727204071.72325: ^ task is: TASK: Include the task 'run_test.yml' 13830 1727204071.72330: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204071.72334: getting variables 13830 1727204071.72335: in VariableManager get_vars() 13830 1727204071.72362: Calling all_inventory to load vars for managed-node3 13830 1727204071.72366: Calling groups_inventory to load vars for managed-node3 13830 1727204071.72371: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.72381: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.72384: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.72387: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.72553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.72766: done with get_vars() 13830 1727204071.72898: done getting variables 13830 1727204071.72933: done sending task result for task 0affcd87-79f5-1659-6b02-00000000000b 13830 1727204071.72936: WORKER PROCESS EXITING TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:42 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.032) 0:00:04.808 ***** 13830 1727204071.73013: entering _queue_task() for managed-node3/include_tasks 13830 1727204071.73474: worker is 1 (out of 1 available) 13830 1727204071.73485: exiting _queue_task() for managed-node3/include_tasks 13830 1727204071.73498: done queuing things up, now waiting for results queue to drain 13830 1727204071.73499: waiting for pending results... 
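Note: the task queued above is an include_tasks at tests_bond_options.yml:42 which, as the trace below shows, loads the collection file tests/network/playbooks/tasks/run_test.yml. A minimal sketch of such an include, assuming the file is referenced by a relative path:

  - name: Include the task 'run_test.yml'
    ansible.builtin.include_tasks: tasks/run_test.yml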
13830 1727204071.73762: running TaskExecutor() for managed-node3/TASK: Include the task 'run_test.yml' 13830 1727204071.73868: in run() - task 0affcd87-79f5-1659-6b02-00000000000d 13830 1727204071.73891: variable 'ansible_search_path' from source: unknown 13830 1727204071.73933: calling self._execute() 13830 1727204071.74022: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.74037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.74055: variable 'omit' from source: magic vars 13830 1727204071.74479: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.74501: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.74510: _execute() done 13830 1727204071.74517: dumping result to json 13830 1727204071.74523: done dumping result, returning 13830 1727204071.74537: done running TaskExecutor() for managed-node3/TASK: Include the task 'run_test.yml' [0affcd87-79f5-1659-6b02-00000000000d] 13830 1727204071.74546: sending task result for task 0affcd87-79f5-1659-6b02-00000000000d 13830 1727204071.74684: no more pending results, returning what we have 13830 1727204071.74690: in VariableManager get_vars() 13830 1727204071.74726: Calling all_inventory to load vars for managed-node3 13830 1727204071.74730: Calling groups_inventory to load vars for managed-node3 13830 1727204071.74734: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.74746: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.74749: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.74752: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.74956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.75205: done with get_vars() 13830 1727204071.75213: variable 'ansible_search_path' from source: unknown 13830 1727204071.75227: we have included files to process 13830 1727204071.75231: generating all_blocks data 13830 1727204071.75232: done generating all_blocks data 13830 1727204071.75233: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 13830 1727204071.75233: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 13830 1727204071.75236: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 13830 1727204071.75819: done sending task result for task 0affcd87-79f5-1659-6b02-00000000000d 13830 1727204071.75822: WORKER PROCESS EXITING 13830 1727204071.76099: in VariableManager get_vars() 13830 1727204071.76114: done with get_vars() 13830 1727204071.76176: in VariableManager get_vars() 13830 1727204071.76191: done with get_vars() 13830 1727204071.76226: in VariableManager get_vars() 13830 1727204071.76240: done with get_vars() 13830 1727204071.76281: in VariableManager get_vars() 13830 1727204071.76294: done with get_vars() 13830 1727204071.76331: in VariableManager get_vars() 13830 1727204071.76343: done with get_vars() 13830 1727204071.76683: in VariableManager get_vars() 13830 1727204071.76696: done with get_vars() 13830 1727204071.76710: done processing included file 13830 1727204071.76712: iterating over new_blocks loaded from include file 13830 1727204071.76713: in VariableManager get_vars() 13830 
1727204071.76725: done with get_vars() 13830 1727204071.76726: filtering new block on tags 13830 1727204071.76830: done filtering new block on tags 13830 1727204071.76834: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node3 13830 1727204071.76840: extending task lists for all hosts with included blocks 13830 1727204071.76878: done extending task lists 13830 1727204071.76879: done processing included files 13830 1727204071.76880: results queue empty 13830 1727204071.76881: checking for any_errors_fatal 13830 1727204071.76884: done checking for any_errors_fatal 13830 1727204071.76885: checking for max_fail_percentage 13830 1727204071.76886: done checking for max_fail_percentage 13830 1727204071.76887: checking to see if all hosts have failed and the running result is not ok 13830 1727204071.76888: done checking to see if all hosts have failed 13830 1727204071.76888: getting the remaining hosts for this loop 13830 1727204071.76890: done getting the remaining hosts for this loop 13830 1727204071.76892: getting the next task for host managed-node3 13830 1727204071.76896: done getting next task for host managed-node3 13830 1727204071.76898: ^ task is: TASK: TEST: {{ lsr_description }} 13830 1727204071.76901: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204071.76903: getting variables 13830 1727204071.76904: in VariableManager get_vars() 13830 1727204071.76911: Calling all_inventory to load vars for managed-node3 13830 1727204071.76913: Calling groups_inventory to load vars for managed-node3 13830 1727204071.76915: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.76920: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.76926: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.76932: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.77092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.77280: done with get_vars() 13830 1727204071.77288: done getting variables 13830 1727204071.77324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204071.77443: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] 
*** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.044) 0:00:04.852 ***** 13830 1727204071.77487: entering _queue_task() for managed-node3/debug 13830 1727204071.77786: worker is 1 (out of 1 available) 13830 1727204071.77801: exiting _queue_task() for managed-node3/debug 13830 1727204071.77810: done queuing things up, now waiting for results queue to drain 13830 1727204071.77812: waiting for pending results... 13830 1727204071.78070: running TaskExecutor() for managed-node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 13830 1727204071.78176: in run() - task 0affcd87-79f5-1659-6b02-000000000088 13830 1727204071.78195: variable 'ansible_search_path' from source: unknown 13830 1727204071.78202: variable 'ansible_search_path' from source: unknown 13830 1727204071.78247: calling self._execute() 13830 1727204071.78323: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.78340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.78354: variable 'omit' from source: magic vars 13830 1727204071.78734: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.78753: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.78762: variable 'omit' from source: magic vars 13830 1727204071.78810: variable 'omit' from source: magic vars 13830 1727204071.78918: variable 'lsr_description' from source: include params 13830 1727204071.78939: variable 'omit' from source: magic vars 13830 1727204071.78982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204071.79033: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.79060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204071.79086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.79107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.79147: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.79155: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.79161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.79272: Set connection var ansible_connection to ssh 13830 1727204071.79288: Set connection var ansible_timeout to 10 13830 1727204071.79296: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.79302: Set connection var ansible_shell_type to sh 13830 1727204071.79311: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204071.79326: Set connection var ansible_pipelining to False 13830 1727204071.79354: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.79361: variable 'ansible_connection' from source: unknown 13830 1727204071.79368: variable 'ansible_module_compression' from source: unknown 13830 1727204071.79374: variable 'ansible_shell_type' from source: unknown 13830 
1727204071.79379: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.79383: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.79389: variable 'ansible_pipelining' from source: unknown 13830 1727204071.79394: variable 'ansible_timeout' from source: unknown 13830 1727204071.79399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.79553: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.79573: variable 'omit' from source: magic vars 13830 1727204071.79582: starting attempt loop 13830 1727204071.79588: running the handler 13830 1727204071.79640: handler run complete 13830 1727204071.79670: attempt loop complete, returning result 13830 1727204071.79680: _execute() done 13830 1727204071.79688: dumping result to json 13830 1727204071.79695: done dumping result, returning 13830 1727204071.79707: done running TaskExecutor() for managed-node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [0affcd87-79f5-1659-6b02-000000000088] 13830 1727204071.79719: sending task result for task 0affcd87-79f5-1659-6b02-000000000088 ok: [managed-node3] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 13830 1727204071.79882: no more pending results, returning what we have 13830 1727204071.79887: results queue empty 13830 1727204071.79888: checking for any_errors_fatal 13830 1727204071.79890: done checking for any_errors_fatal 13830 1727204071.79890: checking for max_fail_percentage 13830 1727204071.79892: done checking for max_fail_percentage 13830 1727204071.79893: checking to see if all hosts have failed and the running result is not ok 13830 1727204071.79894: done checking to see if all hosts have failed 13830 1727204071.79895: getting the remaining hosts for this loop 13830 1727204071.79896: done getting the remaining hosts for this loop 13830 1727204071.79901: getting the next task for host managed-node3 13830 1727204071.79907: done getting next task for host managed-node3 13830 1727204071.79910: ^ task is: TASK: Show item 13830 1727204071.79914: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204071.79917: getting variables 13830 1727204071.79919: in VariableManager get_vars() 13830 1727204071.79955: Calling all_inventory to load vars for managed-node3 13830 1727204071.79959: Calling groups_inventory to load vars for managed-node3 13830 1727204071.79962: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.79975: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.79978: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.79981: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.80179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.80389: done with get_vars() 13830 1727204071.80405: done getting variables 13830 1727204071.80470: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.030) 0:00:04.883 ***** 13830 1727204071.80573: entering _queue_task() for managed-node3/debug 13830 1727204071.80616: done sending task result for task 0affcd87-79f5-1659-6b02-000000000088 13830 1727204071.80626: WORKER PROCESS EXITING 13830 1727204071.81132: worker is 1 (out of 1 available) 13830 1727204071.81147: exiting _queue_task() for managed-node3/debug 13830 1727204071.81161: done queuing things up, now waiting for results queue to drain 13830 1727204071.81163: waiting for pending results... 
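Note: the 'Show item' task at run_test.yml:9 loops over the lsr_* variable names and prints each one; the per-item results below (ansible_loop_var: item plus the named variable's value, or 'VARIABLE IS NOT DEFINED!' for lsr_assert_when) are what a templated debug var produces. A hedged reconstruction; the item list is read off the results below, the task body itself is an assumption:

  - name: Show item
    ansible.builtin.debug:
      var: "{{ item }}"
    loop:
      - lsr_description
      - lsr_setup
      - lsr_test
      - lsr_assert
      - lsr_assert_when
      - lsr_fail_debug
      - lsr_cleanup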
13830 1727204071.81435: running TaskExecutor() for managed-node3/TASK: Show item 13830 1727204071.81548: in run() - task 0affcd87-79f5-1659-6b02-000000000089 13830 1727204071.81567: variable 'ansible_search_path' from source: unknown 13830 1727204071.81575: variable 'ansible_search_path' from source: unknown 13830 1727204071.81640: variable 'omit' from source: magic vars 13830 1727204071.81781: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.81796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.81819: variable 'omit' from source: magic vars 13830 1727204071.82184: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.82201: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.82212: variable 'omit' from source: magic vars 13830 1727204071.82253: variable 'omit' from source: magic vars 13830 1727204071.82571: variable 'item' from source: unknown 13830 1727204071.82657: variable 'item' from source: unknown 13830 1727204071.82682: variable 'omit' from source: magic vars 13830 1727204071.82726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204071.82778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.82803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204071.82825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.82877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.82909: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.82956: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.82982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.83173: Set connection var ansible_connection to ssh 13830 1727204071.83285: Set connection var ansible_timeout to 10 13830 1727204071.83301: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.83309: Set connection var ansible_shell_type to sh 13830 1727204071.83318: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204071.83336: Set connection var ansible_pipelining to False 13830 1727204071.83361: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.83372: variable 'ansible_connection' from source: unknown 13830 1727204071.83406: variable 'ansible_module_compression' from source: unknown 13830 1727204071.83416: variable 'ansible_shell_type' from source: unknown 13830 1727204071.83423: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.83431: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.83476: variable 'ansible_pipelining' from source: unknown 13830 1727204071.83483: variable 'ansible_timeout' from source: unknown 13830 1727204071.83492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.83798: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.83813: variable 'omit' from source: magic vars 13830 1727204071.83822: starting attempt loop 13830 1727204071.83837: running the handler 13830 1727204071.83995: variable 'lsr_description' from source: include params 13830 1727204071.84093: variable 'lsr_description' from source: include params 13830 1727204071.84175: handler run complete 13830 1727204071.84196: attempt loop complete, returning result 13830 1727204071.84273: variable 'item' from source: unknown 13830 1727204071.84347: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." } 13830 1727204071.84788: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.84830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.84850: variable 'omit' from source: magic vars 13830 1727204071.85054: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.85070: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.85079: variable 'omit' from source: magic vars 13830 1727204071.85098: variable 'omit' from source: magic vars 13830 1727204071.85150: variable 'item' from source: unknown 13830 1727204071.85220: variable 'item' from source: unknown 13830 1727204071.85249: variable 'omit' from source: magic vars 13830 1727204071.85278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.85293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.85306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.85325: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.85334: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.85340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.85413: Set connection var ansible_connection to ssh 13830 1727204071.85425: Set connection var ansible_timeout to 10 13830 1727204071.85436: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.85442: Set connection var ansible_shell_type to sh 13830 1727204071.85450: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204071.85470: Set connection var ansible_pipelining to False 13830 1727204071.85492: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.85498: variable 'ansible_connection' from source: unknown 13830 1727204071.85503: variable 'ansible_module_compression' from source: unknown 13830 1727204071.85508: variable 'ansible_shell_type' from source: unknown 13830 1727204071.85513: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.85519: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.85532: variable 
'ansible_pipelining' from source: unknown 13830 1727204071.85540: variable 'ansible_timeout' from source: unknown 13830 1727204071.85547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.85656: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.85678: variable 'omit' from source: magic vars 13830 1727204071.85692: starting attempt loop 13830 1727204071.85699: running the handler 13830 1727204071.85727: variable 'lsr_setup' from source: include params 13830 1727204071.85813: variable 'lsr_setup' from source: include params 13830 1727204071.85868: handler run complete 13830 1727204071.85897: attempt loop complete, returning result 13830 1727204071.85922: variable 'item' from source: unknown 13830 1727204071.85994: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 13830 1727204071.86184: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.86197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.86211: variable 'omit' from source: magic vars 13830 1727204071.86388: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.86400: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.86409: variable 'omit' from source: magic vars 13830 1727204071.86431: variable 'omit' from source: magic vars 13830 1727204071.86483: variable 'item' from source: unknown 13830 1727204071.86557: variable 'item' from source: unknown 13830 1727204071.86579: variable 'omit' from source: magic vars 13830 1727204071.86603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.86617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.86639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.87318: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.87331: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.87340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.87425: Set connection var ansible_connection to ssh 13830 1727204071.87442: Set connection var ansible_timeout to 10 13830 1727204071.87452: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.87459: Set connection var ansible_shell_type to sh 13830 1727204071.87470: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204071.87486: Set connection var ansible_pipelining to False 13830 1727204071.87516: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.87525: variable 'ansible_connection' from source: unknown 13830 1727204071.87535: variable 'ansible_module_compression' from source: unknown 13830 1727204071.87540: variable 'ansible_shell_type' from source: 
unknown 13830 1727204071.87545: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.87549: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.87555: variable 'ansible_pipelining' from source: unknown 13830 1727204071.87560: variable 'ansible_timeout' from source: unknown 13830 1727204071.87569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.87659: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.87675: variable 'omit' from source: magic vars 13830 1727204071.87684: starting attempt loop 13830 1727204071.87692: running the handler 13830 1727204071.87726: variable 'lsr_test' from source: include params 13830 1727204071.87805: variable 'lsr_test' from source: include params 13830 1727204071.87837: handler run complete 13830 1727204071.87856: attempt loop complete, returning result 13830 1727204071.87878: variable 'item' from source: unknown 13830 1727204071.87952: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile.yml" ] } 13830 1727204071.88126: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.88144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.88158: variable 'omit' from source: magic vars 13830 1727204071.88461: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.88476: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.88485: variable 'omit' from source: magic vars 13830 1727204071.88622: variable 'omit' from source: magic vars 13830 1727204071.88671: variable 'item' from source: unknown 13830 1727204071.88748: variable 'item' from source: unknown 13830 1727204071.88847: variable 'omit' from source: magic vars 13830 1727204071.88875: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.88889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.88949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.88967: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.88976: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.88983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.89183: Set connection var ansible_connection to ssh 13830 1727204071.89197: Set connection var ansible_timeout to 10 13830 1727204071.89209: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.89216: Set connection var ansible_shell_type to sh 13830 1727204071.89224: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204071.89241: Set connection var ansible_pipelining to False 13830 1727204071.89385: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.89392: variable 'ansible_connection' from source: unknown 
13830 1727204071.89398: variable 'ansible_module_compression' from source: unknown 13830 1727204071.89403: variable 'ansible_shell_type' from source: unknown 13830 1727204071.89408: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.89413: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.89420: variable 'ansible_pipelining' from source: unknown 13830 1727204071.89425: variable 'ansible_timeout' from source: unknown 13830 1727204071.89434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.89527: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.89600: variable 'omit' from source: magic vars 13830 1727204071.89609: starting attempt loop 13830 1727204071.89674: running the handler 13830 1727204071.89702: variable 'lsr_assert' from source: include params 13830 1727204071.89781: variable 'lsr_assert' from source: include params 13830 1727204071.89941: handler run complete 13830 1727204071.89959: attempt loop complete, returning result 13830 1727204071.89978: variable 'item' from source: unknown 13830 1727204071.90161: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_controller_device_present.yml", "tasks/assert_bond_port_profile_present.yml", "tasks/assert_bond_options.yml" ] } 13830 1727204071.90337: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.90495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.90516: variable 'omit' from source: magic vars 13830 1727204071.90693: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.90825: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.90837: variable 'omit' from source: magic vars 13830 1727204071.90857: variable 'omit' from source: magic vars 13830 1727204071.90970: variable 'item' from source: unknown 13830 1727204071.91153: variable 'item' from source: unknown 13830 1727204071.91176: variable 'omit' from source: magic vars 13830 1727204071.91200: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.91214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.91225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.91270: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.91309: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.91317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.91407: Set connection var ansible_connection to ssh 13830 1727204071.91421: Set connection var ansible_timeout to 10 13830 1727204071.91435: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.91442: Set connection var ansible_shell_type to sh 13830 1727204071.91452: Set connection var 
ansible_module_compression to ZIP_DEFLATED 13830 1727204071.91470: Set connection var ansible_pipelining to False 13830 1727204071.91499: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.91506: variable 'ansible_connection' from source: unknown 13830 1727204071.91513: variable 'ansible_module_compression' from source: unknown 13830 1727204071.91520: variable 'ansible_shell_type' from source: unknown 13830 1727204071.91531: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.91539: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.91547: variable 'ansible_pipelining' from source: unknown 13830 1727204071.91554: variable 'ansible_timeout' from source: unknown 13830 1727204071.91561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.91667: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.91681: variable 'omit' from source: magic vars 13830 1727204071.91696: starting attempt loop 13830 1727204071.91705: running the handler 13830 1727204071.91958: handler run complete 13830 1727204071.91978: attempt loop complete, returning result 13830 1727204071.91998: variable 'item' from source: unknown 13830 1727204071.92082: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 13830 1727204071.92251: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.92267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.92286: variable 'omit' from source: magic vars 13830 1727204071.92454: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.92467: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.92476: variable 'omit' from source: magic vars 13830 1727204071.92499: variable 'omit' from source: magic vars 13830 1727204071.92550: variable 'item' from source: unknown 13830 1727204071.92626: variable 'item' from source: unknown 13830 1727204071.92648: variable 'omit' from source: magic vars 13830 1727204071.92673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.92686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.92696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.92715: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.92722: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.92734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.92812: Set connection var ansible_connection to ssh 13830 1727204071.92833: Set connection var ansible_timeout to 10 13830 1727204071.92847: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.92854: Set connection var 
ansible_shell_type to sh 13830 1727204071.92865: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204071.92879: Set connection var ansible_pipelining to False 13830 1727204071.92904: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.92911: variable 'ansible_connection' from source: unknown 13830 1727204071.92918: variable 'ansible_module_compression' from source: unknown 13830 1727204071.92927: variable 'ansible_shell_type' from source: unknown 13830 1727204071.92940: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.92951: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.92959: variable 'ansible_pipelining' from source: unknown 13830 1727204071.92967: variable 'ansible_timeout' from source: unknown 13830 1727204071.92981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.93106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.93120: variable 'omit' from source: magic vars 13830 1727204071.93131: starting attempt loop 13830 1727204071.93139: running the handler 13830 1727204071.93171: variable 'lsr_fail_debug' from source: play vars 13830 1727204071.93283: variable 'lsr_fail_debug' from source: play vars 13830 1727204071.93308: handler run complete 13830 1727204071.93326: attempt loop complete, returning result 13830 1727204071.93346: variable 'item' from source: unknown 13830 1727204071.93441: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 13830 1727204071.93634: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.93648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.93667: variable 'omit' from source: magic vars 13830 1727204071.93828: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.93843: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.93851: variable 'omit' from source: magic vars 13830 1727204071.93875: variable 'omit' from source: magic vars 13830 1727204071.93932: variable 'item' from source: unknown 13830 1727204071.94005: variable 'item' from source: unknown 13830 1727204071.94025: variable 'omit' from source: magic vars 13830 1727204071.94051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204071.94066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.94083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204071.94105: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204071.94116: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.94124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.94204: Set connection var ansible_connection to ssh 13830 
1727204071.94225: Set connection var ansible_timeout to 10 13830 1727204071.94240: Set connection var ansible_shell_executable to /bin/sh 13830 1727204071.94248: Set connection var ansible_shell_type to sh 13830 1727204071.94258: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204071.94274: Set connection var ansible_pipelining to False 13830 1727204071.94304: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.94315: variable 'ansible_connection' from source: unknown 13830 1727204071.94327: variable 'ansible_module_compression' from source: unknown 13830 1727204071.94340: variable 'ansible_shell_type' from source: unknown 13830 1727204071.94348: variable 'ansible_shell_executable' from source: unknown 13830 1727204071.94356: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.94366: variable 'ansible_pipelining' from source: unknown 13830 1727204071.94374: variable 'ansible_timeout' from source: unknown 13830 1727204071.94382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.94491: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204071.94504: variable 'omit' from source: magic vars 13830 1727204071.94513: starting attempt loop 13830 1727204071.94519: running the handler 13830 1727204071.94555: variable 'lsr_cleanup' from source: include params 13830 1727204071.94622: variable 'lsr_cleanup' from source: include params 13830 1727204071.94652: handler run complete 13830 1727204071.94677: attempt loop complete, returning result 13830 1727204071.94705: variable 'item' from source: unknown 13830 1727204071.94782: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml" ] } 13830 1727204071.94900: dumping result to json 13830 1727204071.94913: done dumping result, returning 13830 1727204071.94926: done running TaskExecutor() for managed-node3/TASK: Show item [0affcd87-79f5-1659-6b02-000000000089] 13830 1727204071.94940: sending task result for task 0affcd87-79f5-1659-6b02-000000000089 13830 1727204071.95120: no more pending results, returning what we have 13830 1727204071.95124: results queue empty 13830 1727204071.95125: checking for any_errors_fatal 13830 1727204071.95135: done checking for any_errors_fatal 13830 1727204071.95136: checking for max_fail_percentage 13830 1727204071.95137: done checking for max_fail_percentage 13830 1727204071.95138: checking to see if all hosts have failed and the running result is not ok 13830 1727204071.95139: done checking to see if all hosts have failed 13830 1727204071.95139: getting the remaining hosts for this loop 13830 1727204071.95141: done getting the remaining hosts for this loop 13830 1727204071.95145: getting the next task for host managed-node3 13830 1727204071.95150: done getting next task for host managed-node3 13830 1727204071.95153: ^ task is: TASK: Include the task 'show_interfaces.yml' 13830 1727204071.95156: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204071.95159: getting variables 13830 1727204071.95161: in VariableManager get_vars() 13830 1727204071.95204: Calling all_inventory to load vars for managed-node3 13830 1727204071.95209: Calling groups_inventory to load vars for managed-node3 13830 1727204071.95213: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.95225: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.95231: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.95234: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.95559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.95815: done with get_vars() 13830 1727204071.95827: done getting variables 13830 1727204071.95981: done sending task result for task 0affcd87-79f5-1659-6b02-000000000089 13830 1727204071.95984: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.154) 0:00:05.038 ***** 13830 1727204071.96069: entering _queue_task() for managed-node3/include_tasks 13830 1727204071.96522: worker is 1 (out of 1 available) 13830 1727204071.96536: exiting _queue_task() for managed-node3/include_tasks 13830 1727204071.96548: done queuing things up, now waiting for results queue to drain 13830 1727204071.96550: waiting for pending results... 
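The 'Show item' results above print one value per loop item (lsr_assert, lsr_assert_when, lsr_fail_debug, lsr_cleanup), with ansible_loop_var reported as 'item'. A minimal sketch of a debug task that would produce that output, assuming the loop list is written inline (the actual run_test.yml may build the list differently):

    # Sketch only: prints the value of the variable named by each loop item.
    # 'lsr_assert_when' is undefined in this run, which is why the log shows
    # "VARIABLE IS NOT DEFINED!" for that item.
    - name: Show item
      debug:
        var: "{{ item }}"
      loop:
        - lsr_assert
        - lsr_assert_when
        - lsr_fail_debug
        - lsr_cleanup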
13830 1727204071.96819: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 13830 1727204071.96932: in run() - task 0affcd87-79f5-1659-6b02-00000000008a 13830 1727204071.96967: variable 'ansible_search_path' from source: unknown 13830 1727204071.96976: variable 'ansible_search_path' from source: unknown 13830 1727204071.97020: calling self._execute() 13830 1727204071.97142: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204071.97155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204071.97177: variable 'omit' from source: magic vars 13830 1727204071.97615: variable 'ansible_distribution_major_version' from source: facts 13830 1727204071.97639: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204071.97650: _execute() done 13830 1727204071.97657: dumping result to json 13830 1727204071.97666: done dumping result, returning 13830 1727204071.97675: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-1659-6b02-00000000008a] 13830 1727204071.97684: sending task result for task 0affcd87-79f5-1659-6b02-00000000008a 13830 1727204071.97822: no more pending results, returning what we have 13830 1727204071.97830: in VariableManager get_vars() 13830 1727204071.97871: Calling all_inventory to load vars for managed-node3 13830 1727204071.97874: Calling groups_inventory to load vars for managed-node3 13830 1727204071.97878: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204071.97892: Calling all_plugins_play to load vars for managed-node3 13830 1727204071.97895: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204071.97899: Calling groups_plugins_play to load vars for managed-node3 13830 1727204071.98107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204071.98314: done with get_vars() 13830 1727204071.98322: variable 'ansible_search_path' from source: unknown 13830 1727204071.98323: variable 'ansible_search_path' from source: unknown 13830 1727204071.98374: we have included files to process 13830 1727204071.98376: generating all_blocks data 13830 1727204071.98378: done generating all_blocks data 13830 1727204071.98386: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 13830 1727204071.98387: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 13830 1727204071.98390: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 13830 1727204071.98847: in VariableManager get_vars() 13830 1727204071.98869: done with get_vars() 13830 1727204071.98902: done sending task result for task 0affcd87-79f5-1659-6b02-00000000008a 13830 1727204071.98905: WORKER PROCESS EXITING 13830 1727204071.99110: done processing included file 13830 1727204071.99112: iterating over new_blocks loaded from include file 13830 1727204071.99114: in VariableManager get_vars() 13830 1727204071.99127: done with get_vars() 13830 1727204071.99132: filtering new block on tags 13830 1727204071.99224: done filtering new block on tags 13830 1727204071.99227: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 13830 1727204071.99235: extending task lists for all hosts with included blocks 13830 1727204072.00184: done extending task lists 13830 1727204072.00186: done processing included files 13830 1727204072.00186: results queue empty 13830 1727204072.00187: checking for any_errors_fatal 13830 1727204072.00193: done checking for any_errors_fatal 13830 1727204072.00194: checking for max_fail_percentage 13830 1727204072.00195: done checking for max_fail_percentage 13830 1727204072.00196: checking to see if all hosts have failed and the running result is not ok 13830 1727204072.00197: done checking to see if all hosts have failed 13830 1727204072.00197: getting the remaining hosts for this loop 13830 1727204072.00199: done getting the remaining hosts for this loop 13830 1727204072.00202: getting the next task for host managed-node3 13830 1727204072.00207: done getting next task for host managed-node3 13830 1727204072.00209: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 13830 1727204072.00212: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204072.00214: getting variables 13830 1727204072.00215: in VariableManager get_vars() 13830 1727204072.00225: Calling all_inventory to load vars for managed-node3 13830 1727204072.00344: Calling groups_inventory to load vars for managed-node3 13830 1727204072.00347: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204072.00353: Calling all_plugins_play to load vars for managed-node3 13830 1727204072.00356: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204072.00359: Calling groups_plugins_play to load vars for managed-node3 13830 1727204072.00803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204072.01234: done with get_vars() 13830 1727204072.01246: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.053) 0:00:05.092 ***** 13830 1727204072.01470: entering _queue_task() for managed-node3/include_tasks 13830 1727204072.01813: worker is 1 (out of 1 available) 13830 1727204072.01825: exiting _queue_task() for managed-node3/include_tasks 13830 1727204072.01839: done queuing things up, now waiting for results queue to drain 13830 1727204072.01840: waiting for pending results... 
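The include chain traced above (run_test.yml:21 pulling in show_interfaces.yml, which in turn includes get_current_interfaces.yml at its line 3) corresponds to plain include_tasks calls. A sketch of those two tasks, reconstructed from the task names and paths recorded in the log rather than from the playbook sources themselves:

    # In .../tests/network/playbooks/tasks/run_test.yml (around line 21, per the task path above)
    - name: Include the task 'show_interfaces.yml'
      include_tasks: tasks/show_interfaces.yml

    # In .../tests/network/playbooks/tasks/show_interfaces.yml (line 3, per the task path above)
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: tasks/get_current_interfaces.yml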
13830 1727204072.03481: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 13830 1727204072.03909: in run() - task 0affcd87-79f5-1659-6b02-0000000000b1 13830 1727204072.03914: variable 'ansible_search_path' from source: unknown 13830 1727204072.03917: variable 'ansible_search_path' from source: unknown 13830 1727204072.04096: calling self._execute() 13830 1727204072.04304: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.04322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.04338: variable 'omit' from source: magic vars 13830 1727204072.05151: variable 'ansible_distribution_major_version' from source: facts 13830 1727204072.05254: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204072.05315: _execute() done 13830 1727204072.05323: dumping result to json 13830 1727204072.05330: done dumping result, returning 13830 1727204072.05341: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-1659-6b02-0000000000b1] 13830 1727204072.05388: sending task result for task 0affcd87-79f5-1659-6b02-0000000000b1 13830 1727204072.05648: no more pending results, returning what we have 13830 1727204072.05655: in VariableManager get_vars() 13830 1727204072.05716: Calling all_inventory to load vars for managed-node3 13830 1727204072.05720: Calling groups_inventory to load vars for managed-node3 13830 1727204072.05724: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204072.06579: Calling all_plugins_play to load vars for managed-node3 13830 1727204072.06584: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204072.06588: Calling groups_plugins_play to load vars for managed-node3 13830 1727204072.06812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204072.07205: done with get_vars() 13830 1727204072.07215: variable 'ansible_search_path' from source: unknown 13830 1727204072.07216: variable 'ansible_search_path' from source: unknown 13830 1727204072.07255: we have included files to process 13830 1727204072.07257: generating all_blocks data 13830 1727204072.07259: done generating all_blocks data 13830 1727204072.07260: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 13830 1727204072.07261: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 13830 1727204072.07266: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 13830 1727204072.07883: done sending task result for task 0affcd87-79f5-1659-6b02-0000000000b1 13830 1727204072.07886: WORKER PROCESS EXITING 13830 1727204072.08645: done processing included file 13830 1727204072.08647: iterating over new_blocks loaded from include file 13830 1727204072.08649: in VariableManager get_vars() 13830 1727204072.08667: done with get_vars() 13830 1727204072.08669: filtering new block on tags 13830 1727204072.08757: done filtering new block on tags 13830 1727204072.08760: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node3 13830 1727204072.08769: extending task lists for all hosts with included blocks 13830 1727204072.08941: done extending task lists 13830 1727204072.08943: done processing included files 13830 1727204072.08943: results queue empty 13830 1727204072.08944: checking for any_errors_fatal 13830 1727204072.08948: done checking for any_errors_fatal 13830 1727204072.08948: checking for max_fail_percentage 13830 1727204072.08949: done checking for max_fail_percentage 13830 1727204072.08950: checking to see if all hosts have failed and the running result is not ok 13830 1727204072.08951: done checking to see if all hosts have failed 13830 1727204072.08952: getting the remaining hosts for this loop 13830 1727204072.08953: done getting the remaining hosts for this loop 13830 1727204072.08956: getting the next task for host managed-node3 13830 1727204072.08960: done getting next task for host managed-node3 13830 1727204072.08962: ^ task is: TASK: Gather current interface info 13830 1727204072.08969: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204072.08972: getting variables 13830 1727204072.08973: in VariableManager get_vars() 13830 1727204072.08982: Calling all_inventory to load vars for managed-node3 13830 1727204072.08985: Calling groups_inventory to load vars for managed-node3 13830 1727204072.08987: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204072.08993: Calling all_plugins_play to load vars for managed-node3 13830 1727204072.08995: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204072.08998: Calling groups_plugins_play to load vars for managed-node3 13830 1727204072.09151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204072.09369: done with get_vars() 13830 1727204072.09378: done getting variables 13830 1727204072.09426: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.079) 0:00:05.172 ***** 13830 1727204072.09459: entering _queue_task() for managed-node3/command 13830 1727204072.09785: worker is 1 (out of 1 available) 13830 1727204072.09798: exiting _queue_task() for managed-node3/command 13830 1727204072.09811: done queuing things up, now waiting for results queue to drain 13830 1727204072.09813: waiting for pending results... 
13830 1727204072.10097: running TaskExecutor() for managed-node3/TASK: Gather current interface info 13830 1727204072.10433: in run() - task 0affcd87-79f5-1659-6b02-0000000000ec 13830 1727204072.10453: variable 'ansible_search_path' from source: unknown 13830 1727204072.10461: variable 'ansible_search_path' from source: unknown 13830 1727204072.10504: calling self._execute() 13830 1727204072.10671: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.10684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.10699: variable 'omit' from source: magic vars 13830 1727204072.11179: variable 'ansible_distribution_major_version' from source: facts 13830 1727204072.11199: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204072.11210: variable 'omit' from source: magic vars 13830 1727204072.11265: variable 'omit' from source: magic vars 13830 1727204072.11309: variable 'omit' from source: magic vars 13830 1727204072.11353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204072.11402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204072.11430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204072.11453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204072.11472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204072.11519: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204072.11522: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.11525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.11629: Set connection var ansible_connection to ssh 13830 1727204072.11646: Set connection var ansible_timeout to 10 13830 1727204072.11674: Set connection var ansible_shell_executable to /bin/sh 13830 1727204072.11685: Set connection var ansible_shell_type to sh 13830 1727204072.11692: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204072.11732: Set connection var ansible_pipelining to False 13830 1727204072.11739: variable 'ansible_shell_executable' from source: unknown 13830 1727204072.11745: variable 'ansible_connection' from source: unknown 13830 1727204072.11752: variable 'ansible_module_compression' from source: unknown 13830 1727204072.11755: variable 'ansible_shell_type' from source: unknown 13830 1727204072.11757: variable 'ansible_shell_executable' from source: unknown 13830 1727204072.11760: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.11766: variable 'ansible_pipelining' from source: unknown 13830 1727204072.11768: variable 'ansible_timeout' from source: unknown 13830 1727204072.11772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.11889: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204072.11897: variable 'omit' from source: magic vars 13830 
1727204072.11902: starting attempt loop 13830 1727204072.11904: running the handler 13830 1727204072.11916: _low_level_execute_command(): starting 13830 1727204072.11923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204072.12434: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.12452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.12477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.12493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.12532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204072.12549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204072.12602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204072.14797: stdout chunk (state=3): >>>/root <<< 13830 1727204072.14986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204072.15041: stderr chunk (state=3): >>><<< 13830 1727204072.15093: stdout chunk (state=3): >>><<< 13830 1727204072.15125: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204072.15189: _low_level_execute_command(): starting 13830 1727204072.15205: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232 `" && echo ansible-tmp-1727204072.1517096-14382-268688880480232="` 
echo /root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232 `" ) && sleep 0' 13830 1727204072.15858: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204072.15886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.15902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.15922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.15969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.15988: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204072.16015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.16035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204072.16047: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204072.16058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204072.16098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.16170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204072.16195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204072.16216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204072.16299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13830 1727204072.18880: stdout chunk (state=3): >>>ansible-tmp-1727204072.1517096-14382-268688880480232=/root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232 <<< 13830 1727204072.19152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204072.19156: stdout chunk (state=3): >>><<< 13830 1727204072.19159: stderr chunk (state=3): >>><<< 13830 1727204072.19571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204072.1517096-14382-268688880480232=/root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13830 1727204072.19575: variable 'ansible_module_compression' from source: unknown 13830 1727204072.19578: ANSIBALLZ: Using generic lock for ansible.legacy.command 13830 1727204072.19580: ANSIBALLZ: Acquiring lock 13830 1727204072.19582: ANSIBALLZ: Lock acquired: 140043657885840 13830 1727204072.19584: ANSIBALLZ: Creating module 13830 1727204072.39719: ANSIBALLZ: Writing module into payload 13830 1727204072.40294: ANSIBALLZ: Writing module 13830 1727204072.40328: ANSIBALLZ: Renaming module 13830 1727204072.40339: ANSIBALLZ: Done creating module 13830 1727204072.40360: variable 'ansible_facts' from source: unknown 13830 1727204072.40444: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232/AnsiballZ_command.py 13830 1727204072.42211: Sending initial data 13830 1727204072.42215: Sent initial data (156 bytes) 13830 1727204072.44111: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204072.45090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.45109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.45129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.45180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.45192: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204072.45205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.45223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204072.45235: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204072.45248: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204072.45259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.45275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.45291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.45302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.45311: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204072.45323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.45393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204072.45410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204072.45423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204072.45496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204072.47204: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204072.47238: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204072.47279: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmphle26ky0 /root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232/AnsiballZ_command.py <<< 13830 1727204072.47316: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204072.48696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204072.48819: stderr chunk (state=3): >>><<< 13830 1727204072.48822: stdout chunk (state=3): >>><<< 13830 1727204072.48824: done transferring module to remote 13830 1727204072.48826: _low_level_execute_command(): starting 13830 1727204072.48829: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232/ /root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232/AnsiballZ_command.py && sleep 0' 13830 1727204072.50128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204072.50288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.50303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.50326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.50377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.50480: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204072.50499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.50518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204072.50532: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204072.50543: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204072.50556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.50572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.50589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.50603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.50616: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204072.50629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.50702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204072.50730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204072.50838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204072.50945: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204072.52696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204072.52743: stderr chunk (state=3): >>><<< 13830 1727204072.52746: stdout chunk (state=3): >>><<< 13830 1727204072.52848: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204072.52852: _low_level_execute_command(): starting 13830 1727204072.52854: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232/AnsiballZ_command.py && sleep 0' 13830 1727204072.54318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204072.54333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.54348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.54370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.54415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.54429: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204072.54444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.54488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204072.54501: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204072.54513: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204072.54528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.54542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.54557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.54606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.54618: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204072.54633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 
1727204072.54748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204072.54785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204072.54800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204072.54933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204072.68212: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:54:32.677890", "end": "2024-09-24 14:54:32.681128", "delta": "0:00:00.003238", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204072.69448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204072.69453: stdout chunk (state=3): >>><<< 13830 1727204072.69455: stderr chunk (state=3): >>><<< 13830 1727204072.69604: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:54:32.677890", "end": "2024-09-24 14:54:32.681128", "delta": "0:00:00.003238", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
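The module result above shows the exact invocation: ansible.legacy.command executed with chdir '/sys/class/net' and raw params 'ls -1', returning the interface names on stdout. A sketch of the 'Gather current interface info' task that would produce this call, assuming the result is registered as _current_interfaces (that variable name appears later in the log; the register keyword itself is an inference):

    # Sketch of the 'Gather current interface info' task
    # (get_current_interfaces.yml:3 per the task path above).
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces  # assumed name, matching the variable used by the following set_fact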
13830 1727204072.69613: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204072.69616: _low_level_execute_command(): starting 13830 1727204072.69618: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204072.1517096-14382-268688880480232/ > /dev/null 2>&1 && sleep 0' 13830 1727204072.70169: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204072.70184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.70199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.70218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.70260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.70274: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204072.70288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.70306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204072.70317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204072.70328: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204072.70341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.70356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.70375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.70387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204072.70397: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204072.70410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.70491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204072.70519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204072.70536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204072.70607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204072.72363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204072.72443: stderr chunk (state=3): >>><<< 13830 1727204072.72446: stdout chunk (state=3): >>><<< 13830 1727204072.72676: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204072.72679: handler run complete 13830 1727204072.72682: Evaluated conditional (False): False 13830 1727204072.72684: attempt loop complete, returning result 13830 1727204072.72685: _execute() done 13830 1727204072.72687: dumping result to json 13830 1727204072.72689: done dumping result, returning 13830 1727204072.72690: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-1659-6b02-0000000000ec] 13830 1727204072.72692: sending task result for task 0affcd87-79f5-1659-6b02-0000000000ec 13830 1727204072.72842: done sending task result for task 0affcd87-79f5-1659-6b02-0000000000ec 13830 1727204072.72845: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003238", "end": "2024-09-24 14:54:32.681128", "rc": 0, "start": "2024-09-24 14:54:32.677890" } STDOUT: bonding_masters eth0 lo 13830 1727204072.72923: no more pending results, returning what we have 13830 1727204072.72928: results queue empty 13830 1727204072.72929: checking for any_errors_fatal 13830 1727204072.72930: done checking for any_errors_fatal 13830 1727204072.72931: checking for max_fail_percentage 13830 1727204072.72933: done checking for max_fail_percentage 13830 1727204072.72934: checking to see if all hosts have failed and the running result is not ok 13830 1727204072.72934: done checking to see if all hosts have failed 13830 1727204072.72935: getting the remaining hosts for this loop 13830 1727204072.72937: done getting the remaining hosts for this loop 13830 1727204072.72941: getting the next task for host managed-node3 13830 1727204072.72948: done getting next task for host managed-node3 13830 1727204072.72951: ^ task is: TASK: Set current_interfaces 13830 1727204072.72958: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204072.72961: getting variables 13830 1727204072.72963: in VariableManager get_vars() 13830 1727204072.72989: Calling all_inventory to load vars for managed-node3 13830 1727204072.72991: Calling groups_inventory to load vars for managed-node3 13830 1727204072.72995: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204072.73006: Calling all_plugins_play to load vars for managed-node3 13830 1727204072.73008: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204072.73016: Calling groups_plugins_play to load vars for managed-node3 13830 1727204072.73370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204072.73734: done with get_vars() 13830 1727204072.73747: done getting variables 13830 1727204072.73795: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.643) 0:00:05.816 ***** 13830 1727204072.73818: entering _queue_task() for managed-node3/set_fact 13830 1727204072.74038: worker is 1 (out of 1 available) 13830 1727204072.74051: exiting _queue_task() for managed-node3/set_fact 13830 1727204072.74065: done queuing things up, now waiting for results queue to drain 13830 1727204072.74067: waiting for pending results... 
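The 'Set current_interfaces' task queued above turns the registered command output into a fact; its result below sets current_interfaces to ["bonding_masters", "eth0", "lo"]. A sketch of such a set_fact task, assuming the list is taken from the registered result's stdout_lines (the exact expression is not shown in the log):

    # Sketch of the 'Set current_interfaces' task
    # (get_current_interfaces.yml:9 per the task path above).
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # assumed expression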
13830 1727204072.74216: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 13830 1727204072.74289: in run() - task 0affcd87-79f5-1659-6b02-0000000000ed 13830 1727204072.74299: variable 'ansible_search_path' from source: unknown 13830 1727204072.74306: variable 'ansible_search_path' from source: unknown 13830 1727204072.74334: calling self._execute() 13830 1727204072.74394: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.74398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.74408: variable 'omit' from source: magic vars 13830 1727204072.74681: variable 'ansible_distribution_major_version' from source: facts 13830 1727204072.74692: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204072.74697: variable 'omit' from source: magic vars 13830 1727204072.74736: variable 'omit' from source: magic vars 13830 1727204072.74812: variable '_current_interfaces' from source: set_fact 13830 1727204072.74861: variable 'omit' from source: magic vars 13830 1727204072.74894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204072.74920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204072.74939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204072.74953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204072.74963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204072.74987: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204072.74990: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.74993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.75058: Set connection var ansible_connection to ssh 13830 1727204072.75071: Set connection var ansible_timeout to 10 13830 1727204072.75076: Set connection var ansible_shell_executable to /bin/sh 13830 1727204072.75079: Set connection var ansible_shell_type to sh 13830 1727204072.75084: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204072.75091: Set connection var ansible_pipelining to False 13830 1727204072.75108: variable 'ansible_shell_executable' from source: unknown 13830 1727204072.75111: variable 'ansible_connection' from source: unknown 13830 1727204072.75113: variable 'ansible_module_compression' from source: unknown 13830 1727204072.75115: variable 'ansible_shell_type' from source: unknown 13830 1727204072.75118: variable 'ansible_shell_executable' from source: unknown 13830 1727204072.75120: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.75122: variable 'ansible_pipelining' from source: unknown 13830 1727204072.75125: variable 'ansible_timeout' from source: unknown 13830 1727204072.75132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.75234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 13830 1727204072.75240: variable 'omit' from source: magic vars 13830 1727204072.75245: starting attempt loop 13830 1727204072.75248: running the handler 13830 1727204072.75258: handler run complete 13830 1727204072.75269: attempt loop complete, returning result 13830 1727204072.75272: _execute() done 13830 1727204072.75276: dumping result to json 13830 1727204072.75278: done dumping result, returning 13830 1727204072.75281: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-1659-6b02-0000000000ed] 13830 1727204072.75286: sending task result for task 0affcd87-79f5-1659-6b02-0000000000ed 13830 1727204072.75365: done sending task result for task 0affcd87-79f5-1659-6b02-0000000000ed 13830 1727204072.75369: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 13830 1727204072.75444: no more pending results, returning what we have 13830 1727204072.75447: results queue empty 13830 1727204072.75448: checking for any_errors_fatal 13830 1727204072.75460: done checking for any_errors_fatal 13830 1727204072.75461: checking for max_fail_percentage 13830 1727204072.75462: done checking for max_fail_percentage 13830 1727204072.75465: checking to see if all hosts have failed and the running result is not ok 13830 1727204072.75466: done checking to see if all hosts have failed 13830 1727204072.75467: getting the remaining hosts for this loop 13830 1727204072.75468: done getting the remaining hosts for this loop 13830 1727204072.75472: getting the next task for host managed-node3 13830 1727204072.75478: done getting next task for host managed-node3 13830 1727204072.75484: ^ task is: TASK: Show current_interfaces 13830 1727204072.75488: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204072.75496: getting variables 13830 1727204072.75504: in VariableManager get_vars() 13830 1727204072.75592: Calling all_inventory to load vars for managed-node3 13830 1727204072.75594: Calling groups_inventory to load vars for managed-node3 13830 1727204072.75596: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204072.75603: Calling all_plugins_play to load vars for managed-node3 13830 1727204072.75604: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204072.75606: Calling groups_plugins_play to load vars for managed-node3 13830 1727204072.75794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204072.76050: done with get_vars() 13830 1727204072.76059: done getting variables 13830 1727204072.76119: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.023) 0:00:05.839 ***** 13830 1727204072.76158: entering _queue_task() for managed-node3/debug 13830 1727204072.76427: worker is 1 (out of 1 available) 13830 1727204072.76441: exiting _queue_task() for managed-node3/debug 13830 1727204072.76459: done queuing things up, now waiting for results queue to drain 13830 1727204072.76461: waiting for pending results... 13830 1727204072.76731: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 13830 1727204072.76846: in run() - task 0affcd87-79f5-1659-6b02-0000000000b2 13830 1727204072.76867: variable 'ansible_search_path' from source: unknown 13830 1727204072.76879: variable 'ansible_search_path' from source: unknown 13830 1727204072.76924: calling self._execute() 13830 1727204072.77027: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.77040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.77052: variable 'omit' from source: magic vars 13830 1727204072.77453: variable 'ansible_distribution_major_version' from source: facts 13830 1727204072.77478: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204072.77490: variable 'omit' from source: magic vars 13830 1727204072.77539: variable 'omit' from source: magic vars 13830 1727204072.77817: variable 'current_interfaces' from source: set_fact 13830 1727204072.77822: variable 'omit' from source: magic vars 13830 1727204072.77825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204072.77830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204072.77833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204072.77835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204072.77838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204072.77840: 
variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204072.77843: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.77845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.77894: Set connection var ansible_connection to ssh 13830 1727204072.77903: Set connection var ansible_timeout to 10 13830 1727204072.77908: Set connection var ansible_shell_executable to /bin/sh 13830 1727204072.77939: Set connection var ansible_shell_type to sh 13830 1727204072.77943: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204072.77945: Set connection var ansible_pipelining to False 13830 1727204072.77952: variable 'ansible_shell_executable' from source: unknown 13830 1727204072.77956: variable 'ansible_connection' from source: unknown 13830 1727204072.77959: variable 'ansible_module_compression' from source: unknown 13830 1727204072.77961: variable 'ansible_shell_type' from source: unknown 13830 1727204072.77963: variable 'ansible_shell_executable' from source: unknown 13830 1727204072.77967: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.77969: variable 'ansible_pipelining' from source: unknown 13830 1727204072.77973: variable 'ansible_timeout' from source: unknown 13830 1727204072.77983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.78339: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204072.78342: variable 'omit' from source: magic vars 13830 1727204072.78344: starting attempt loop 13830 1727204072.78346: running the handler 13830 1727204072.78347: handler run complete 13830 1727204072.78349: attempt loop complete, returning result 13830 1727204072.78351: _execute() done 13830 1727204072.78352: dumping result to json 13830 1727204072.78354: done dumping result, returning 13830 1727204072.78356: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-1659-6b02-0000000000b2] 13830 1727204072.78358: sending task result for task 0affcd87-79f5-1659-6b02-0000000000b2 13830 1727204072.78426: done sending task result for task 0affcd87-79f5-1659-6b02-0000000000b2 13830 1727204072.78428: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 13830 1727204072.78477: no more pending results, returning what we have 13830 1727204072.78480: results queue empty 13830 1727204072.78481: checking for any_errors_fatal 13830 1727204072.78486: done checking for any_errors_fatal 13830 1727204072.78486: checking for max_fail_percentage 13830 1727204072.78488: done checking for max_fail_percentage 13830 1727204072.78488: checking to see if all hosts have failed and the running result is not ok 13830 1727204072.78489: done checking to see if all hosts have failed 13830 1727204072.78490: getting the remaining hosts for this loop 13830 1727204072.78491: done getting the remaining hosts for this loop 13830 1727204072.78495: getting the next task for host managed-node3 13830 1727204072.78501: done getting next task for host managed-node3 13830 1727204072.78504: ^ task is: TASK: Setup 13830 1727204072.78506: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204072.78510: getting variables 13830 1727204072.78511: in VariableManager get_vars() 13830 1727204072.78538: Calling all_inventory to load vars for managed-node3 13830 1727204072.78541: Calling groups_inventory to load vars for managed-node3 13830 1727204072.78544: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204072.78552: Calling all_plugins_play to load vars for managed-node3 13830 1727204072.78554: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204072.78557: Calling groups_plugins_play to load vars for managed-node3 13830 1727204072.78729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204072.78919: done with get_vars() 13830 1727204072.78930: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.028) 0:00:05.868 ***** 13830 1727204072.79016: entering _queue_task() for managed-node3/include_tasks 13830 1727204072.79419: worker is 1 (out of 1 available) 13830 1727204072.79435: exiting _queue_task() for managed-node3/include_tasks 13830 1727204072.79446: done queuing things up, now waiting for results queue to drain 13830 1727204072.79448: waiting for pending results... 
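Note on the "Setup" step queued here: the trace that follows shows an include_tasks loop driven by the lsr_setup include parameter, and the two items visible later are tasks/create_test_interfaces_with_dhcp.yml and tasks/assert_dhcp_device_present.yml. A minimal sketch, assuming a plain loop over lsr_setup (the exact loop keyword used in run_test.yml is not visible in the log):

- name: Setup
  include_tasks: "{{ item }}"   # each item is a task file path relative to the test playbook
  loop: "{{ lsr_setup }}"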
13830 1727204072.79604: running TaskExecutor() for managed-node3/TASK: Setup 13830 1727204072.79667: in run() - task 0affcd87-79f5-1659-6b02-00000000008b 13830 1727204072.79677: variable 'ansible_search_path' from source: unknown 13830 1727204072.79680: variable 'ansible_search_path' from source: unknown 13830 1727204072.79717: variable 'lsr_setup' from source: include params 13830 1727204072.79869: variable 'lsr_setup' from source: include params 13830 1727204072.79920: variable 'omit' from source: magic vars 13830 1727204072.80020: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.80033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.80045: variable 'omit' from source: magic vars 13830 1727204072.80260: variable 'ansible_distribution_major_version' from source: facts 13830 1727204072.80278: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204072.80287: variable 'item' from source: unknown 13830 1727204072.80350: variable 'item' from source: unknown 13830 1727204072.80388: variable 'item' from source: unknown 13830 1727204072.80446: variable 'item' from source: unknown 13830 1727204072.80614: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.80625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.80636: variable 'omit' from source: magic vars 13830 1727204072.80836: variable 'ansible_distribution_major_version' from source: facts 13830 1727204072.80845: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204072.80853: variable 'item' from source: unknown 13830 1727204072.80911: variable 'item' from source: unknown 13830 1727204072.80943: variable 'item' from source: unknown 13830 1727204072.81002: variable 'item' from source: unknown 13830 1727204072.81084: dumping result to json 13830 1727204072.81094: done dumping result, returning 13830 1727204072.81108: done running TaskExecutor() for managed-node3/TASK: Setup [0affcd87-79f5-1659-6b02-00000000008b] 13830 1727204072.81143: sending task result for task 0affcd87-79f5-1659-6b02-00000000008b 13830 1727204072.81238: no more pending results, returning what we have 13830 1727204072.81244: in VariableManager get_vars() 13830 1727204072.81286: Calling all_inventory to load vars for managed-node3 13830 1727204072.81289: Calling groups_inventory to load vars for managed-node3 13830 1727204072.81294: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204072.81306: Calling all_plugins_play to load vars for managed-node3 13830 1727204072.81309: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204072.81312: Calling groups_plugins_play to load vars for managed-node3 13830 1727204072.81469: done sending task result for task 0affcd87-79f5-1659-6b02-00000000008b 13830 1727204072.81472: WORKER PROCESS EXITING 13830 1727204072.81541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204072.81718: done with get_vars() 13830 1727204072.81725: variable 'ansible_search_path' from source: unknown 13830 1727204072.81726: variable 'ansible_search_path' from source: unknown 13830 1727204072.81767: variable 'ansible_search_path' from source: unknown 13830 1727204072.81768: variable 'ansible_search_path' from source: unknown 13830 1727204072.81794: we have included files to process 13830 1727204072.81795: generating all_blocks data 13830 
1727204072.81797: done generating all_blocks data 13830 1727204072.81801: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 13830 1727204072.81802: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 13830 1727204072.81804: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 13830 1727204072.82989: done processing included file 13830 1727204072.82991: iterating over new_blocks loaded from include file 13830 1727204072.82993: in VariableManager get_vars() 13830 1727204072.83007: done with get_vars() 13830 1727204072.83009: filtering new block on tags 13830 1727204072.83099: done filtering new block on tags 13830 1727204072.83102: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed-node3 => (item=tasks/create_test_interfaces_with_dhcp.yml) 13830 1727204072.83107: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 13830 1727204072.83108: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 13830 1727204072.83112: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 13830 1727204072.83320: in VariableManager get_vars() 13830 1727204072.83334: done with get_vars() 13830 1727204072.83345: variable 'item' from source: include params 13830 1727204072.83446: variable 'item' from source: include params 13830 1727204072.83475: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13830 1727204072.83722: in VariableManager get_vars() 13830 1727204072.83738: done with get_vars() 13830 1727204072.83844: in VariableManager get_vars() 13830 1727204072.83856: done with get_vars() 13830 1727204072.83860: variable 'item' from source: include params 13830 1727204072.83909: variable 'item' from source: include params 13830 1727204072.83934: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13830 1727204072.83986: in VariableManager get_vars() 13830 1727204072.83998: done with get_vars() 13830 1727204072.84072: done processing included file 13830 1727204072.84073: iterating over new_blocks loaded from include file 13830 1727204072.84074: in VariableManager get_vars() 13830 1727204072.84082: done with get_vars() 13830 1727204072.84084: filtering new block on tags 13830 1727204072.84127: done filtering new block on tags 13830 1727204072.84132: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed-node3 => (item=tasks/assert_dhcp_device_present.yml) 13830 1727204072.84136: extending task lists for all hosts with included blocks 13830 1727204072.84481: done extending task lists 13830 1727204072.84482: done processing included files 13830 1727204072.84483: results queue empty 13830 1727204072.84483: checking for any_errors_fatal 13830 1727204072.84485: done checking for any_errors_fatal 13830 1727204072.84486: checking for max_fail_percentage 13830 1727204072.84486: done checking for max_fail_percentage 13830 1727204072.84487: checking to see if all hosts have failed and the running result is not ok 13830 1727204072.84488: done checking to see if all hosts have failed 13830 1727204072.84493: getting the remaining hosts for this loop 13830 1727204072.84494: done getting the remaining hosts for this loop 13830 1727204072.84495: getting the next task for host managed-node3 13830 1727204072.84498: done getting next task for host managed-node3 13830 1727204072.84499: ^ task is: TASK: Install dnsmasq 13830 1727204072.84502: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204072.84503: getting variables 13830 1727204072.84504: in VariableManager get_vars() 13830 1727204072.84510: Calling all_inventory to load vars for managed-node3 13830 1727204072.84511: Calling groups_inventory to load vars for managed-node3 13830 1727204072.84513: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204072.84517: Calling all_plugins_play to load vars for managed-node3 13830 1727204072.84518: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204072.84520: Calling groups_plugins_play to load vars for managed-node3 13830 1727204072.84623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204072.84737: done with get_vars() 13830 1727204072.84744: done getting variables 13830 1727204072.84773: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:32 -0400 (0:00:00.057) 0:00:05.926 ***** 13830 1727204072.84795: entering _queue_task() for managed-node3/package 13830 1727204072.85018: worker is 1 (out of 1 available) 13830 1727204072.85034: exiting _queue_task() for managed-node3/package 13830 1727204072.85048: done queuing things up, now waiting for results queue to drain 13830 1727204072.85050: waiting for pending results... 
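Note on the "Install dnsmasq" task queued here: the module arguments echoed further down (name: ["dnsmasq"], state: present, dispatched through the package action resolving to ansible.legacy.dnf), the "attempts": 1 field in the result, and the "__install_status is success" conditional indicate a retried package install roughly like the sketch below. The retries and delay values are assumptions; only the until condition and the module arguments are visible in the log.

- name: Install dnsmasq
  package:
    name: dnsmasq
    state: present
  register: __install_status
  until: __install_status is success   # matches the conditional evaluated in the log
  retries: 6                           # assumed retry count
  delay: 10                            # assumed delay in seconds

The "attempts": 1 in the task result means the first try succeeded; here it was a no-op ("Nothing to do") because dnsmasq was already installed on managed-node3.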
13830 1727204072.85207: running TaskExecutor() for managed-node3/TASK: Install dnsmasq 13830 1727204072.85280: in run() - task 0affcd87-79f5-1659-6b02-000000000112 13830 1727204072.85290: variable 'ansible_search_path' from source: unknown 13830 1727204072.85293: variable 'ansible_search_path' from source: unknown 13830 1727204072.85320: calling self._execute() 13830 1727204072.85382: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.85387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.85395: variable 'omit' from source: magic vars 13830 1727204072.85661: variable 'ansible_distribution_major_version' from source: facts 13830 1727204072.85679: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204072.85684: variable 'omit' from source: magic vars 13830 1727204072.85718: variable 'omit' from source: magic vars 13830 1727204072.85855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204072.87431: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204072.87481: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204072.87507: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204072.87539: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204072.87561: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204072.87646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204072.87669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204072.87692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204072.87719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204072.87738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204072.87811: variable '__network_is_ostree' from source: set_fact 13830 1727204072.87815: variable 'omit' from source: magic vars 13830 1727204072.87846: variable 'omit' from source: magic vars 13830 1727204072.87871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204072.87891: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204072.87906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204072.87919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13830 1727204072.87928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204072.87955: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204072.87958: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.87961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.88028: Set connection var ansible_connection to ssh 13830 1727204072.88039: Set connection var ansible_timeout to 10 13830 1727204072.88045: Set connection var ansible_shell_executable to /bin/sh 13830 1727204072.88048: Set connection var ansible_shell_type to sh 13830 1727204072.88052: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204072.88064: Set connection var ansible_pipelining to False 13830 1727204072.88084: variable 'ansible_shell_executable' from source: unknown 13830 1727204072.88087: variable 'ansible_connection' from source: unknown 13830 1727204072.88090: variable 'ansible_module_compression' from source: unknown 13830 1727204072.88092: variable 'ansible_shell_type' from source: unknown 13830 1727204072.88094: variable 'ansible_shell_executable' from source: unknown 13830 1727204072.88096: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204072.88098: variable 'ansible_pipelining' from source: unknown 13830 1727204072.88101: variable 'ansible_timeout' from source: unknown 13830 1727204072.88106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204072.88179: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204072.88188: variable 'omit' from source: magic vars 13830 1727204072.88193: starting attempt loop 13830 1727204072.88195: running the handler 13830 1727204072.88201: variable 'ansible_facts' from source: unknown 13830 1727204072.88203: variable 'ansible_facts' from source: unknown 13830 1727204072.88246: _low_level_execute_command(): starting 13830 1727204072.88252: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204072.88773: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204072.88789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.88805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204072.88826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.88876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204072.88889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204072.88944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204072.90573: stdout chunk (state=3): >>>/root <<< 13830 1727204072.90670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204072.90732: stderr chunk (state=3): >>><<< 13830 1727204072.90737: stdout chunk (state=3): >>><<< 13830 1727204072.90761: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204072.90774: _low_level_execute_command(): starting 13830 1727204072.90779: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891 `" && echo ansible-tmp-1727204072.907608-14468-31906917560891="` echo /root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891 `" ) && sleep 0' 13830 1727204072.91276: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204072.91296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.91309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.91320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204072.91333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204072.91375: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 13830 1727204072.91387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204072.91439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204072.93259: stdout chunk (state=3): >>>ansible-tmp-1727204072.907608-14468-31906917560891=/root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891 <<< 13830 1727204072.93374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204072.93427: stderr chunk (state=3): >>><<< 13830 1727204072.93432: stdout chunk (state=3): >>><<< 13830 1727204072.93454: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204072.907608-14468-31906917560891=/root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204072.93481: variable 'ansible_module_compression' from source: unknown 13830 1727204072.93536: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 13830 1727204072.93540: ANSIBALLZ: Acquiring lock 13830 1727204072.93544: ANSIBALLZ: Lock acquired: 140043657885840 13830 1727204072.93546: ANSIBALLZ: Creating module 13830 1727204073.06741: ANSIBALLZ: Writing module into payload 13830 1727204073.06937: ANSIBALLZ: Writing module 13830 1727204073.06960: ANSIBALLZ: Renaming module 13830 1727204073.06972: ANSIBALLZ: Done creating module 13830 1727204073.06985: variable 'ansible_facts' from source: unknown 13830 1727204073.07044: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891/AnsiballZ_dnf.py 13830 1727204073.07154: Sending initial data 13830 1727204073.07160: Sent initial data (150 bytes) 13830 1727204073.07874: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204073.07888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204073.07908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204073.07925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204073.07945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204073.07979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204073.07991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204073.08053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204073.09938: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204073.09963: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204073.10012: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpbg0866f5 /root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891/AnsiballZ_dnf.py <<< 13830 1727204073.10054: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204073.11402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204073.11511: stderr chunk (state=3): >>><<< 13830 1727204073.11517: stdout chunk (state=3): >>><<< 13830 1727204073.11549: done transferring module to remote 13830 1727204073.11560: _low_level_execute_command(): starting 13830 1727204073.11568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891/ /root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891/AnsiballZ_dnf.py && sleep 0' 13830 1727204073.12186: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master <<< 13830 1727204073.12195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204073.12198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204073.12261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204073.14051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204073.14056: stdout chunk (state=3): >>><<< 13830 1727204073.14061: stderr chunk (state=3): >>><<< 13830 1727204073.14080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204073.14083: _low_level_execute_command(): starting 13830 1727204073.14089: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891/AnsiballZ_dnf.py && sleep 0' 13830 1727204073.14734: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204073.14744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204073.14754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204073.14768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204073.14807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204073.14814: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204073.14823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204073.14839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204073.14847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204073.14853: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204073.14861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204073.14873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204073.14884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204073.14892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 13830 1727204073.14898: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204073.14906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204073.14984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204073.15002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204073.15015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204073.15093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204074.07589: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13830 1727204074.11803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204074.11807: stdout chunk (state=3): >>><<< 13830 1727204074.11810: stderr chunk (state=3): >>><<< 13830 1727204074.11955: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204074.11959: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204074.11961: _low_level_execute_command(): starting 13830 1727204074.11971: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204072.907608-14468-31906917560891/ > /dev/null 2>&1 && sleep 0' 13830 1727204074.12606: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.12610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.12655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.12659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.12661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.12709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204074.12727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204074.12730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204074.12775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204074.14537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204074.14607: stderr chunk (state=3): >>><<< 13830 1727204074.14610: stdout chunk (state=3): >>><<< 13830 1727204074.14636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204074.14981: handler run complete 13830 1727204074.14984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204074.14987: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204074.15022: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204074.15059: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204074.15087: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204074.15161: variable '__install_status' from source: unknown 13830 1727204074.15211: Evaluated conditional (__install_status is success): True 13830 1727204074.15224: attempt loop complete, returning result 13830 1727204074.15227: _execute() done 13830 1727204074.15229: dumping result to json 13830 1727204074.15231: done dumping result, returning 13830 1727204074.15243: done running TaskExecutor() for managed-node3/TASK: Install dnsmasq [0affcd87-79f5-1659-6b02-000000000112] 13830 1727204074.15246: sending task result for task 0affcd87-79f5-1659-6b02-000000000112 13830 1727204074.15353: done sending task result for task 0affcd87-79f5-1659-6b02-000000000112 13830 1727204074.15356: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13830 1727204074.15437: no more pending results, returning what we have 13830 1727204074.15441: results queue empty 13830 1727204074.15442: checking for any_errors_fatal 13830 1727204074.15444: done checking for any_errors_fatal 13830 1727204074.15444: checking for max_fail_percentage 13830 1727204074.15446: done checking for max_fail_percentage 13830 1727204074.15447: checking to see if all hosts have failed and the running result is not ok 13830 1727204074.15447: done checking to see if all hosts have failed 13830 1727204074.15448: getting the remaining hosts for this loop 13830 1727204074.15449: done getting the remaining hosts for this loop 13830 1727204074.15453: getting the next task for host managed-node3 13830 1727204074.15459: done getting next task for host managed-node3 13830 1727204074.15461: ^ task is: TASK: Install pgrep, sysctl 13830 1727204074.15471: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204074.15474: getting variables 13830 1727204074.15475: in VariableManager get_vars() 13830 1727204074.15504: Calling all_inventory to load vars for managed-node3 13830 1727204074.15507: Calling groups_inventory to load vars for managed-node3 13830 1727204074.15510: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204074.15520: Calling all_plugins_play to load vars for managed-node3 13830 1727204074.15522: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204074.15524: Calling groups_plugins_play to load vars for managed-node3 13830 1727204074.15687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204074.15910: done with get_vars() 13830 1727204074.15921: done getting variables 13830 1727204074.15978: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:54:34 -0400 (0:00:01.312) 0:00:07.238 ***** 13830 1727204074.16008: entering _queue_task() for managed-node3/package 13830 1727204074.16273: worker is 1 (out of 1 available) 13830 1727204074.16286: exiting _queue_task() for managed-node3/package 13830 1727204074.16298: done queuing things up, now waiting for results queue to drain 13830 1727204074.16300: waiting for pending results... 
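
The "Install dnsmasq" result above (rc=0, attempts=1, "Nothing to do") together with the earlier checks on __install_status is consistent with a package task that registers __install_status and retries until it succeeds. A minimal reconstruction, inferred only from the module arguments and conditionals visible in this log, might look like the sketch below; the use of the generic package module and the retries/delay values are assumptions, not the actual contents of create_test_interfaces_with_dhcp.yml.

# Hedged sketch reconstructed from the log above; only the module arguments
# (name: dnsmasq, state: present) and the "__install_status is success" retry
# condition are visible in the log, retries/delay are assumed values.
- name: Install dnsmasq
  package:
    name: dnsmasq
    state: present
  register: __install_status
  until: __install_status is success
  retries: 3   # assumed
  delay: 10    # assumed
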
13830 1727204074.16553: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 13830 1727204074.16739: in run() - task 0affcd87-79f5-1659-6b02-000000000113 13830 1727204074.16759: variable 'ansible_search_path' from source: unknown 13830 1727204074.16769: variable 'ansible_search_path' from source: unknown 13830 1727204074.16845: calling self._execute() 13830 1727204074.16975: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204074.16991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204074.17003: variable 'omit' from source: magic vars 13830 1727204074.17392: variable 'ansible_distribution_major_version' from source: facts 13830 1727204074.17412: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204074.17549: variable 'ansible_os_family' from source: facts 13830 1727204074.17552: Evaluated conditional (ansible_os_family == 'RedHat'): True 13830 1727204074.17726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204074.18020: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204074.18074: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204074.18116: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204074.18155: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204074.18243: variable 'ansible_distribution_major_version' from source: facts 13830 1727204074.18261: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 13830 1727204074.18276: when evaluation is False, skipping this task 13830 1727204074.18280: _execute() done 13830 1727204074.18282: dumping result to json 13830 1727204074.18287: done dumping result, returning 13830 1727204074.18292: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [0affcd87-79f5-1659-6b02-000000000113] 13830 1727204074.18297: sending task result for task 0affcd87-79f5-1659-6b02-000000000113 13830 1727204074.18410: done sending task result for task 0affcd87-79f5-1659-6b02-000000000113 13830 1727204074.18413: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 13830 1727204074.18472: no more pending results, returning what we have 13830 1727204074.18476: results queue empty 13830 1727204074.18477: checking for any_errors_fatal 13830 1727204074.18495: done checking for any_errors_fatal 13830 1727204074.18496: checking for max_fail_percentage 13830 1727204074.18497: done checking for max_fail_percentage 13830 1727204074.18498: checking to see if all hosts have failed and the running result is not ok 13830 1727204074.18499: done checking to see if all hosts have failed 13830 1727204074.18500: getting the remaining hosts for this loop 13830 1727204074.18502: done getting the remaining hosts for this loop 13830 1727204074.18506: getting the next task for host managed-node3 13830 1727204074.18511: done getting next task for host managed-node3 13830 1727204074.18514: ^ task is: TASK: Install pgrep, sysctl 13830 1727204074.18517: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204074.18521: getting variables 13830 1727204074.18522: in VariableManager get_vars() 13830 1727204074.18550: Calling all_inventory to load vars for managed-node3 13830 1727204074.18553: Calling groups_inventory to load vars for managed-node3 13830 1727204074.18556: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204074.18566: Calling all_plugins_play to load vars for managed-node3 13830 1727204074.18597: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204074.18602: Calling groups_plugins_play to load vars for managed-node3 13830 1727204074.18794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204074.19007: done with get_vars() 13830 1727204074.19017: done getting variables 13830 1727204074.19077: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:54:34 -0400 (0:00:00.030) 0:00:07.269 ***** 13830 1727204074.19111: entering _queue_task() for managed-node3/package 13830 1727204074.19354: worker is 1 (out of 1 available) 13830 1727204074.19367: exiting _queue_task() for managed-node3/package 13830 1727204074.19379: done queuing things up, now waiting for results queue to drain 13830 1727204074.19380: waiting for pending results... 
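
The task at create_test_interfaces_with_dhcp.yml:17 was just skipped because ansible_distribution_major_version is version('6', '<=') evaluated False on this host. A minimal sketch of that conditional pattern follows; the package name is a placeholder and the grouping of the when-conditions is an assumption, since only the individual expressions appear in the log.

# Hedged sketch of the skipped EL6-only variant; "procps" is a placeholder
# package name, only the evaluated when-expressions appear in the log.
- name: Install pgrep, sysctl
  package:
    name: procps          # placeholder for the EL6-era package name
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('6', '<=')
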
13830 1727204074.19642: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 13830 1727204074.19737: in run() - task 0affcd87-79f5-1659-6b02-000000000114 13830 1727204074.19753: variable 'ansible_search_path' from source: unknown 13830 1727204074.19779: variable 'ansible_search_path' from source: unknown 13830 1727204074.19817: calling self._execute() 13830 1727204074.19914: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204074.19934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204074.19948: variable 'omit' from source: magic vars 13830 1727204074.21340: variable 'ansible_distribution_major_version' from source: facts 13830 1727204074.21357: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204074.21509: variable 'ansible_os_family' from source: facts 13830 1727204074.21526: Evaluated conditional (ansible_os_family == 'RedHat'): True 13830 1727204074.21705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204074.22083: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204074.22132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204074.22191: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204074.22231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204074.22315: variable 'ansible_distribution_major_version' from source: facts 13830 1727204074.22333: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 13830 1727204074.22344: variable 'omit' from source: magic vars 13830 1727204074.22421: variable 'omit' from source: magic vars 13830 1727204074.22589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204074.25849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204074.25940: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204074.25989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204074.26032: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204074.26068: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204074.26174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204074.26253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204074.26288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204074.26341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204074.26411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204074.26532: variable '__network_is_ostree' from source: set_fact 13830 1727204074.26544: variable 'omit' from source: magic vars 13830 1727204074.26627: variable 'omit' from source: magic vars 13830 1727204074.26661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204074.26697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204074.26723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204074.26755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204074.26773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204074.26810: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204074.26820: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204074.26828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204074.26941: Set connection var ansible_connection to ssh 13830 1727204074.26967: Set connection var ansible_timeout to 10 13830 1727204074.26979: Set connection var ansible_shell_executable to /bin/sh 13830 1727204074.26986: Set connection var ansible_shell_type to sh 13830 1727204074.26995: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204074.27009: Set connection var ansible_pipelining to False 13830 1727204074.27041: variable 'ansible_shell_executable' from source: unknown 13830 1727204074.27049: variable 'ansible_connection' from source: unknown 13830 1727204074.27063: variable 'ansible_module_compression' from source: unknown 13830 1727204074.27074: variable 'ansible_shell_type' from source: unknown 13830 1727204074.27081: variable 'ansible_shell_executable' from source: unknown 13830 1727204074.27088: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204074.27095: variable 'ansible_pipelining' from source: unknown 13830 1727204074.27165: variable 'ansible_timeout' from source: unknown 13830 1727204074.27182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204074.27355: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204074.27373: variable 'omit' from source: magic vars 13830 1727204074.27385: starting attempt loop 13830 1727204074.27398: running the handler 13830 1727204074.27410: variable 'ansible_facts' from source: unknown 13830 1727204074.27417: variable 'ansible_facts' from source: unknown 13830 1727204074.27476: _low_level_execute_command(): starting 13830 1727204074.27488: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204074.28245: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 13830 1727204074.28263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204074.28287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.28305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.28350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204074.28362: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204074.28381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.28401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204074.28423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204074.28436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204074.28448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204074.28460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.28481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.28497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204074.28507: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204074.28520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.28606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204074.28634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204074.28650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204074.28740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204074.30342: stdout chunk (state=3): >>>/root <<< 13830 1727204074.30528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204074.30535: stdout chunk (state=3): >>><<< 13830 1727204074.30537: stderr chunk (state=3): >>><<< 13830 1727204074.30660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 13830 1727204074.30668: _low_level_execute_command(): starting 13830 1727204074.30672: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040 `" && echo ansible-tmp-1727204074.3055935-14641-39569926294040="` echo /root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040 `" ) && sleep 0' 13830 1727204074.32001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204074.32005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.32041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204074.32045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.32048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.32105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204074.32688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204074.32695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204074.32754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204074.34614: stdout chunk (state=3): >>>ansible-tmp-1727204074.3055935-14641-39569926294040=/root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040 <<< 13830 1727204074.34714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204074.34796: stderr chunk (state=3): >>><<< 13830 1727204074.34799: stdout chunk (state=3): >>><<< 13830 1727204074.35076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204074.3055935-14641-39569926294040=/root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204074.35080: variable 'ansible_module_compression' from source: unknown 13830 1727204074.35082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 13830 1727204074.35084: variable 'ansible_facts' from source: unknown 13830 1727204074.35091: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040/AnsiballZ_dnf.py 13830 1727204074.35367: Sending initial data 13830 1727204074.35377: Sent initial data (151 bytes) 13830 1727204074.36824: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.36831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.36839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204074.36851: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204074.36871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.36890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204074.36902: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204074.36912: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204074.36928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204074.36943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.36958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.36972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204074.36983: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204074.36994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.37066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204074.37090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204074.37105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204074.37175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204074.38861: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204074.38905: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204074.38947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpo5v34o28 /root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040/AnsiballZ_dnf.py <<< 13830 1727204074.38981: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204074.40885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204074.40985: stderr chunk (state=3): >>><<< 13830 1727204074.40989: stdout chunk (state=3): >>><<< 13830 1727204074.41016: done transferring module to remote 13830 1727204074.41019: _low_level_execute_command(): starting 13830 1727204074.41022: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040/ /root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040/AnsiballZ_dnf.py && sleep 0' 13830 1727204074.42056: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204074.42060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.42104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204074.42107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.42110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.42169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204074.42186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204074.42246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204074.43937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204074.44089: stderr chunk (state=3): >>><<< 13830 1727204074.44093: stdout chunk (state=3): >>><<< 13830 1727204074.44190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204074.44194: _low_level_execute_command(): starting 13830 1727204074.44196: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040/AnsiballZ_dnf.py && sleep 0' 13830 1727204074.45113: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204074.45121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204074.45134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.45153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.45198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204074.45211: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204074.45225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.45242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204074.45254: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204074.45272: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204074.45287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204074.45301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204074.45317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204074.45330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204074.45342: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204074.45357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204074.45436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204074.45457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204074.45477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204074.45561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204075.37754: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": 
true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13830 1727204075.41916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204075.41979: stderr chunk (state=3): >>><<< 13830 1727204075.41982: stdout chunk (state=3): >>><<< 13830 1727204075.42001: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204075.42037: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204075.42044: _low_level_execute_command(): starting 13830 1727204075.42049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204074.3055935-14641-39569926294040/ > /dev/null 2>&1 && sleep 0' 13830 1727204075.42542: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.42559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.42573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.42585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.42596: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.42642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204075.42661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204075.42702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204075.44471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204075.44523: stderr chunk (state=3): >>><<< 13830 1727204075.44527: stdout chunk (state=3): >>><<< 13830 1727204075.44544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204075.44551: handler run complete 13830 1727204075.44583: attempt loop complete, returning result 13830 1727204075.44586: _execute() done 13830 1727204075.44588: dumping result to json 13830 1727204075.44591: done dumping result, returning 13830 1727204075.44600: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [0affcd87-79f5-1659-6b02-000000000114] 13830 1727204075.44604: sending task result for task 0affcd87-79f5-1659-6b02-000000000114 13830 1727204075.44703: done sending task result for task 0affcd87-79f5-1659-6b02-000000000114 13830 1727204075.44705: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13830 1727204075.44780: no more pending results, returning what we have 13830 1727204075.44783: results queue empty 13830 1727204075.44784: checking for any_errors_fatal 13830 1727204075.44795: done checking for any_errors_fatal 13830 1727204075.44795: checking for max_fail_percentage 13830 1727204075.44797: done checking for max_fail_percentage 13830 1727204075.44798: checking to see if all hosts have failed and the running result is not ok 13830 1727204075.44799: done checking to see if all hosts have failed 13830 1727204075.44799: getting the remaining hosts for this loop 13830 1727204075.44801: done getting the remaining hosts for this loop 13830 1727204075.44805: getting the next task for host managed-node3 13830 1727204075.44811: done getting next task for host managed-node3 13830 1727204075.44815: ^ task is: TASK: Create test interfaces 13830 1727204075.44819: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204075.44822: getting variables 13830 1727204075.44824: in VariableManager get_vars() 13830 1727204075.44853: Calling all_inventory to load vars for managed-node3 13830 1727204075.44856: Calling groups_inventory to load vars for managed-node3 13830 1727204075.44859: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204075.44876: Calling all_plugins_play to load vars for managed-node3 13830 1727204075.44879: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204075.44882: Calling groups_plugins_play to load vars for managed-node3 13830 1727204075.45051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204075.45172: done with get_vars() 13830 1727204075.45181: done getting variables 13830 1727204075.45252: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:54:35 -0400 (0:00:01.261) 0:00:08.530 ***** 13830 1727204075.45275: entering _queue_task() for managed-node3/shell 13830 1727204075.45276: Creating lock for shell 13830 1727204075.45485: worker is 1 (out of 1 available) 13830 1727204075.45498: exiting _queue_task() for managed-node3/shell 13830 1727204075.45510: done queuing things up, now waiting for results queue to drain 13830 1727204075.45511: waiting for pending results... 
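
The shell task now being queued templates the play vars dhcp_interface1 and dhcp_interface2 (see the variable lookups that follow). The log never shows the script body, so the commands below are only an illustrative sketch of a veth-based test-interface setup, not the collection's real helper.

# Illustrative sketch only: the log shows a shell task that templates
# dhcp_interface1 and dhcp_interface2, but not the script body, so the
# ip link commands below are assumptions.
- name: Create test interfaces
  shell: |
    ip link add {{ dhcp_interface1 }} type veth peer name {{ dhcp_interface1 }}p
    ip link add {{ dhcp_interface2 }} type veth peer name {{ dhcp_interface2 }}p
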
13830 1727204075.45669: running TaskExecutor() for managed-node3/TASK: Create test interfaces 13830 1727204075.45736: in run() - task 0affcd87-79f5-1659-6b02-000000000115 13830 1727204075.45752: variable 'ansible_search_path' from source: unknown 13830 1727204075.45755: variable 'ansible_search_path' from source: unknown 13830 1727204075.45783: calling self._execute() 13830 1727204075.45845: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204075.45856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204075.45867: variable 'omit' from source: magic vars 13830 1727204075.46218: variable 'ansible_distribution_major_version' from source: facts 13830 1727204075.46238: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204075.46248: variable 'omit' from source: magic vars 13830 1727204075.46309: variable 'omit' from source: magic vars 13830 1727204075.46698: variable 'dhcp_interface1' from source: play vars 13830 1727204075.46708: variable 'dhcp_interface2' from source: play vars 13830 1727204075.46743: variable 'omit' from source: magic vars 13830 1727204075.46791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204075.46834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204075.46863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204075.46889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204075.46904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204075.46941: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204075.46950: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204075.46957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204075.47056: Set connection var ansible_connection to ssh 13830 1727204075.47074: Set connection var ansible_timeout to 10 13830 1727204075.47083: Set connection var ansible_shell_executable to /bin/sh 13830 1727204075.47091: Set connection var ansible_shell_type to sh 13830 1727204075.47099: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204075.47112: Set connection var ansible_pipelining to False 13830 1727204075.47141: variable 'ansible_shell_executable' from source: unknown 13830 1727204075.47148: variable 'ansible_connection' from source: unknown 13830 1727204075.47155: variable 'ansible_module_compression' from source: unknown 13830 1727204075.47160: variable 'ansible_shell_type' from source: unknown 13830 1727204075.47169: variable 'ansible_shell_executable' from source: unknown 13830 1727204075.47175: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204075.47182: variable 'ansible_pipelining' from source: unknown 13830 1727204075.47188: variable 'ansible_timeout' from source: unknown 13830 1727204075.47195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204075.47339: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204075.47354: variable 'omit' from source: magic vars 13830 1727204075.47363: starting attempt loop 13830 1727204075.47371: running the handler 13830 1727204075.47384: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204075.47405: _low_level_execute_command(): starting 13830 1727204075.47418: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204075.48157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204075.48171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.48182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.48197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.48240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.48248: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204075.48258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.48274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204075.48285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204075.48292: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204075.48298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.48309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.48322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.48328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.48339: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204075.48348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.48424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204075.48446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204075.48458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204075.48541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204075.50066: stdout chunk (state=3): >>>/root <<< 13830 1727204075.50163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204075.50268: stderr chunk (state=3): >>><<< 13830 1727204075.50271: stdout chunk (state=3): >>><<< 13830 1727204075.50419: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204075.50433: _low_level_execute_command(): starting 13830 1727204075.50437: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093 `" && echo ansible-tmp-1727204075.5029533-14761-215225289425093="` echo /root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093 `" ) && sleep 0' 13830 1727204075.51206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204075.51239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.51257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.51286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.51366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.51387: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204075.51422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.51447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204075.51459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204075.51473: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204075.51487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.51509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.51552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.51592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.51605: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204075.51623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.51710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204075.51768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204075.51786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204075.51873: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13830 1727204075.53692: stdout chunk (state=3): >>>ansible-tmp-1727204075.5029533-14761-215225289425093=/root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093 <<< 13830 1727204075.53895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204075.53899: stdout chunk (state=3): >>><<< 13830 1727204075.53901: stderr chunk (state=3): >>><<< 13830 1727204075.54171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204075.5029533-14761-215225289425093=/root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204075.54174: variable 'ansible_module_compression' from source: unknown 13830 1727204075.54177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204075.54179: variable 'ansible_facts' from source: unknown 13830 1727204075.54181: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093/AnsiballZ_command.py 13830 1727204075.54330: Sending initial data 13830 1727204075.54334: Sent initial data (156 bytes) 13830 1727204075.55382: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204075.55406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.55423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.55443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.55491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.55511: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204075.55527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.55545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204075.55558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204075.55573: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204075.55586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.55600: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.55624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.55638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.55651: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204075.55667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.55748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204075.55776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204075.55795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204075.55870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204075.57576: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204075.57618: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204075.57656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpjkzod6fm /root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093/AnsiballZ_command.py <<< 13830 1727204075.58254: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204075.58759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204075.59037: stderr chunk (state=3): >>><<< 13830 1727204075.59041: stdout chunk (state=3): >>><<< 13830 1727204075.59043: done transferring module to remote 13830 1727204075.59045: _low_level_execute_command(): starting 13830 1727204075.59047: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093/ /root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093/AnsiballZ_command.py && sleep 0' 13830 1727204075.59655: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204075.59671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.59691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.59710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.59757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.59771: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204075.59786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.59810: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 13830 1727204075.59822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204075.59833: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204075.59846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.59859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.59878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.59891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.59904: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204075.59922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.59999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204075.60029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204075.60047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204075.60120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204075.61803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204075.61900: stderr chunk (state=3): >>><<< 13830 1727204075.61912: stdout chunk (state=3): >>><<< 13830 1727204075.61970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204075.61973: _low_level_execute_command(): starting 13830 1727204075.61976: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093/AnsiballZ_command.py && sleep 0' 13830 1727204075.62643: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204075.62657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.62674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.62699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.62743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 <<< 13830 1727204075.62755: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204075.62772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.62790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204075.62809: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204075.62821: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204075.62833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204075.62847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204075.62865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204075.62878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204075.62889: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204075.62909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204075.62986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204075.63006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204075.63027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204075.63102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204076.96900: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 616 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 616 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n 
nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 13830 1727204076.96911: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:35.761035", "end": "2024-09-24 14:54:36.967432", "delta": "0:00:01.206397", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204076.98189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204076.98261: stderr chunk (state=3): >>><<< 13830 1727204076.98268: stdout chunk (state=3): >>><<< 13830 1727204076.98373: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 616 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 616 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:35.761035", "end": "2024-09-24 14:54:36.967432", "delta": "0:00:01.206397", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
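
For readability, the shell payload executed by the "Create test interfaces" task above (delivered escaped through _raw_params) reduces to roughly the following. This is an abridged reconstruction, not the verbatim script: the RHEL 6 branch, the retry loop around the bridge address, and the firewalld service handling are omitted, while the device names, addresses, and dnsmasq options are exactly those shown in the log.

# Abridged reconstruction of the task's shell script (full version is in _raw_params above)
set -euxo pipefail
exec 1>&2

# veth pairs providing the test interfaces; NetworkManager must not manage the peer (server) ends
ip link add test1 type veth peer name test1p
ip link add test2 type veth peer name test2p
if [ -n "$(pgrep NetworkManager)" ]; then
    nmcli d set test1 managed true
    nmcli d set test2 managed true
    nmcli d set test1p managed false
    nmcli d set test2p managed false
fi
ip link set test1p up
ip link set test2p up

# bridge that the peer ends are enslaved to, with static IPv4/IPv6 addresses
ip link add name testbr type bridge forward_delay 0
ip link set testbr up
ip addr add 192.0.2.1/24 dev testbr
ip -6 addr add 2001:DB8::1/32 dev testbr
ip link set test1p master testbr
ip link set test2p master testbr

# joint DHCPv4/DHCPv6 server with router advertisements on the bridge
dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease \
    --dhcp-range=192.0.2.1,192.0.2.254,240 \
    --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
    --enable-ra --interface=testbr --bind-interfaces
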
13830 1727204076.98384: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204076.98467: _low_level_execute_command(): starting 13830 1727204076.98471: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204075.5029533-14761-215225289425093/ > /dev/null 2>&1 && sleep 0' 13830 1727204076.99863: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204076.99984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204076.99999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.00021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.00068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.00180: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204077.00195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.00211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204077.00222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204077.00232: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204077.00244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.00256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.00275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.00286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.00296: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204077.00309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.00585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.00610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.00627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.00704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.02585: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 13830 1727204077.02589: stdout chunk (state=3): >>><<< 13830 1727204077.02592: stderr chunk (state=3): >>><<< 13830 1727204077.02771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204077.02776: handler run complete 13830 1727204077.02779: Evaluated conditional (False): False 13830 1727204077.02782: attempt loop complete, returning result 13830 1727204077.02785: _execute() done 13830 1727204077.02792: dumping result to json 13830 1727204077.02794: done dumping result, returning 13830 1727204077.02796: done running TaskExecutor() for managed-node3/TASK: Create test interfaces [0affcd87-79f5-1659-6b02-000000000115] 13830 1727204077.02798: sending task result for task 0affcd87-79f5-1659-6b02-000000000115 13830 1727204077.02889: done sending task result for task 0affcd87-79f5-1659-6b02-000000000115 13830 1727204077.02892: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.206397", "end": "2024-09-24 14:54:36.967432", "rc": 0, "start": "2024-09-24 14:54:35.761035" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 616 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 616 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 13830 1727204077.02970: no more pending results, returning what we have 13830 1727204077.02974: results queue empty 13830 1727204077.02974: checking for any_errors_fatal 13830 1727204077.02982: done checking for any_errors_fatal 13830 1727204077.02983: checking for max_fail_percentage 13830 1727204077.02985: done checking for max_fail_percentage 13830 1727204077.02986: checking to see if all hosts have failed and the running result is not ok 13830 1727204077.02986: done checking to see if all hosts have failed 13830 1727204077.02987: getting the remaining hosts for this loop 13830 1727204077.02989: done getting the remaining hosts for this loop 13830 1727204077.02993: getting the next task for host managed-node3 13830 1727204077.03001: done getting next task for 
host managed-node3 13830 1727204077.03004: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13830 1727204077.03008: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204077.03012: getting variables 13830 1727204077.03014: in VariableManager get_vars() 13830 1727204077.03046: Calling all_inventory to load vars for managed-node3 13830 1727204077.03049: Calling groups_inventory to load vars for managed-node3 13830 1727204077.03053: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.03062: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.03066: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.03069: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.03237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.03562: done with get_vars() 13830 1727204077.03576: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:01.585) 0:00:10.116 ***** 13830 1727204077.03800: entering _queue_task() for managed-node3/include_tasks 13830 1727204077.04709: worker is 1 (out of 1 available) 13830 1727204077.04722: exiting _queue_task() for managed-node3/include_tasks 13830 1727204077.04734: done queuing things up, now waiting for results queue to drain 13830 1727204077.04736: waiting for pending results... 
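
The include queued next pulls in get_interface_stat.yml, whose task "Get stat for interface {{ interface }}" (rendered below as "Get stat for interface test1") runs the stat module against the freshly created device. The exact path it stats is not visible in this log; assuming it checks the kernel's per-device directory, a rough manual equivalent on the managed node would be:

# Hypothetical equivalent of the stat task; the /sys/class/net path is an assumption, not taken from this log
iface=test1
if [ -e "/sys/class/net/$iface" ]; then
    echo "interface $iface present"
else
    echo "interface $iface missing" >&2
    exit 1
fi
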
13830 1727204077.05411: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 13830 1727204077.05690: in run() - task 0affcd87-79f5-1659-6b02-00000000011c 13830 1727204077.05707: variable 'ansible_search_path' from source: unknown 13830 1727204077.05714: variable 'ansible_search_path' from source: unknown 13830 1727204077.05815: calling self._execute() 13830 1727204077.05898: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.05979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.06282: variable 'omit' from source: magic vars 13830 1727204077.06639: variable 'ansible_distribution_major_version' from source: facts 13830 1727204077.06784: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204077.06794: _execute() done 13830 1727204077.06801: dumping result to json 13830 1727204077.06809: done dumping result, returning 13830 1727204077.06818: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1659-6b02-00000000011c] 13830 1727204077.06828: sending task result for task 0affcd87-79f5-1659-6b02-00000000011c 13830 1727204077.06955: no more pending results, returning what we have 13830 1727204077.06963: in VariableManager get_vars() 13830 1727204077.07003: Calling all_inventory to load vars for managed-node3 13830 1727204077.07007: Calling groups_inventory to load vars for managed-node3 13830 1727204077.07010: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.07024: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.07026: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.07031: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.07251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.07476: done with get_vars() 13830 1727204077.07487: variable 'ansible_search_path' from source: unknown 13830 1727204077.07488: variable 'ansible_search_path' from source: unknown 13830 1727204077.07501: done sending task result for task 0affcd87-79f5-1659-6b02-00000000011c 13830 1727204077.07504: WORKER PROCESS EXITING 13830 1727204077.07533: we have included files to process 13830 1727204077.07534: generating all_blocks data 13830 1727204077.07536: done generating all_blocks data 13830 1727204077.07541: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204077.07542: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204077.07544: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204077.08010: done processing included file 13830 1727204077.08012: iterating over new_blocks loaded from include file 13830 1727204077.08014: in VariableManager get_vars() 13830 1727204077.08146: done with get_vars() 13830 1727204077.08148: filtering new block on tags 13830 1727204077.08183: done filtering new block on tags 13830 1727204077.08186: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 13830 
1727204077.08191: extending task lists for all hosts with included blocks 13830 1727204077.08647: done extending task lists 13830 1727204077.08649: done processing included files 13830 1727204077.08650: results queue empty 13830 1727204077.08651: checking for any_errors_fatal 13830 1727204077.08657: done checking for any_errors_fatal 13830 1727204077.08658: checking for max_fail_percentage 13830 1727204077.08659: done checking for max_fail_percentage 13830 1727204077.08660: checking to see if all hosts have failed and the running result is not ok 13830 1727204077.08661: done checking to see if all hosts have failed 13830 1727204077.08661: getting the remaining hosts for this loop 13830 1727204077.08663: done getting the remaining hosts for this loop 13830 1727204077.08668: getting the next task for host managed-node3 13830 1727204077.08673: done getting next task for host managed-node3 13830 1727204077.08675: ^ task is: TASK: Get stat for interface {{ interface }} 13830 1727204077.08679: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204077.08681: getting variables 13830 1727204077.08795: in VariableManager get_vars() 13830 1727204077.08809: Calling all_inventory to load vars for managed-node3 13830 1727204077.08812: Calling groups_inventory to load vars for managed-node3 13830 1727204077.08814: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.08820: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.08823: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.08826: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.09095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.09528: done with get_vars() 13830 1727204077.09542: done getting variables 13830 1727204077.09946: variable 'interface' from source: task vars 13830 1727204077.09951: variable 'dhcp_interface1' from source: play vars 13830 1727204077.10139: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.063) 0:00:10.179 ***** 13830 1727204077.10176: entering _queue_task() for managed-node3/stat 13830 1727204077.10839: worker is 1 (out of 1 available) 13830 1727204077.10852: exiting _queue_task() for managed-node3/stat 13830 1727204077.10979: done queuing things up, now waiting for results queue to drain 13830 1727204077.10981: waiting for pending results... 13830 1727204077.11720: running TaskExecutor() for managed-node3/TASK: Get stat for interface test1 13830 1727204077.11859: in run() - task 0affcd87-79f5-1659-6b02-00000000017b 13830 1727204077.12086: variable 'ansible_search_path' from source: unknown 13830 1727204077.12093: variable 'ansible_search_path' from source: unknown 13830 1727204077.12134: calling self._execute() 13830 1727204077.12218: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.12279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.12294: variable 'omit' from source: magic vars 13830 1727204077.13234: variable 'ansible_distribution_major_version' from source: facts 13830 1727204077.13252: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204077.13262: variable 'omit' from source: magic vars 13830 1727204077.13347: variable 'omit' from source: magic vars 13830 1727204077.13593: variable 'interface' from source: task vars 13830 1727204077.13819: variable 'dhcp_interface1' from source: play vars 13830 1727204077.13891: variable 'dhcp_interface1' from source: play vars 13830 1727204077.13911: variable 'omit' from source: magic vars 13830 1727204077.14073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204077.14107: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204077.14128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204077.14266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204077.14277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 13830 1727204077.14306: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204077.14310: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.14312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.14531: Set connection var ansible_connection to ssh 13830 1727204077.14545: Set connection var ansible_timeout to 10 13830 1727204077.14551: Set connection var ansible_shell_executable to /bin/sh 13830 1727204077.14554: Set connection var ansible_shell_type to sh 13830 1727204077.14559: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204077.14572: Set connection var ansible_pipelining to False 13830 1727204077.14714: variable 'ansible_shell_executable' from source: unknown 13830 1727204077.14720: variable 'ansible_connection' from source: unknown 13830 1727204077.14723: variable 'ansible_module_compression' from source: unknown 13830 1727204077.14726: variable 'ansible_shell_type' from source: unknown 13830 1727204077.14728: variable 'ansible_shell_executable' from source: unknown 13830 1727204077.14730: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.14732: variable 'ansible_pipelining' from source: unknown 13830 1727204077.14735: variable 'ansible_timeout' from source: unknown 13830 1727204077.14737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.15180: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204077.15190: variable 'omit' from source: magic vars 13830 1727204077.15196: starting attempt loop 13830 1727204077.15199: running the handler 13830 1727204077.15213: _low_level_execute_command(): starting 13830 1727204077.15221: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204077.17523: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.17657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.17673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.17689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.17732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.17745: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204077.17762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.17878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204077.17885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204077.17892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204077.17900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.17910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.17921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.17928: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.17939: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204077.17948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.18027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.18042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.18095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.18212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.19820: stdout chunk (state=3): >>>/root <<< 13830 1727204077.19985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.20039: stderr chunk (state=3): >>><<< 13830 1727204077.20042: stdout chunk (state=3): >>><<< 13830 1727204077.20156: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204077.20160: _low_level_execute_command(): starting 13830 1727204077.20163: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852 `" && echo ansible-tmp-1727204077.2006311-14811-180626830303852="` echo /root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852 `" ) && sleep 0' 13830 1727204077.21776: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.21780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.21816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204077.21820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.21824: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.21999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.22016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.22080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.23921: stdout chunk (state=3): >>>ansible-tmp-1727204077.2006311-14811-180626830303852=/root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852 <<< 13830 1727204077.24032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.24120: stderr chunk (state=3): >>><<< 13830 1727204077.24123: stdout chunk (state=3): >>><<< 13830 1727204077.24441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204077.2006311-14811-180626830303852=/root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204077.24444: variable 'ansible_module_compression' from source: unknown 13830 1727204077.24446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13830 1727204077.24448: variable 'ansible_facts' from source: unknown 13830 1727204077.24450: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852/AnsiballZ_stat.py 13830 1727204077.24605: Sending initial data 13830 1727204077.24608: Sent initial data (153 bytes) 13830 1727204077.25763: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.25775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.25786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.25800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.25849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.25860: stderr chunk (state=3): >>>debug2: match not found <<< 
13830 1727204077.25882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.25896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204077.25903: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204077.25910: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204077.25919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.25929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.25948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.25955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.25965: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204077.25978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.26073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.26093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.26105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.26180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.27897: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204077.27966: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204077.27983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp6k_l00as /root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852/AnsiballZ_stat.py <<< 13830 1727204077.28019: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204077.29208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.29470: stderr chunk (state=3): >>><<< 13830 1727204077.29473: stdout chunk (state=3): >>><<< 13830 1727204077.29475: done transferring module to remote 13830 1727204077.29478: _low_level_execute_command(): starting 13830 1727204077.29480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852/ /root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852/AnsiballZ_stat.py && sleep 0' 13830 1727204077.30076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.30091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.30106: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13830 1727204077.30124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.30178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.30190: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204077.30203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.30221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204077.30232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204077.30247: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204077.30260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.30275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.30289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.30300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.30312: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204077.30325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.30405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.30428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.30445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.30519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.32298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.32303: stdout chunk (state=3): >>><<< 13830 1727204077.32307: stderr chunk (state=3): >>><<< 13830 1727204077.32327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204077.32330: _low_level_execute_command(): starting 13830 1727204077.32338: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852/AnsiballZ_stat.py && sleep 0' 13830 1727204077.33145: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.33199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.33229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.33242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.33246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.33275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.33338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.33344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.33416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.46522: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26463, "dev": 21, "nlink": 1, "atime": 1727204075.7688673, "mtime": 1727204075.7688673, "ctime": 1727204075.7688673, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13830 1727204077.47447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204077.47503: stderr chunk (state=3): >>><<< 13830 1727204077.47507: stdout chunk (state=3): >>><<< 13830 1727204077.47526: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26463, "dev": 21, "nlink": 1, "atime": 1727204075.7688673, "mtime": 1727204075.7688673, "ctime": 1727204075.7688673, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
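For reference, the module invocation echoed in the result above (stat with get_attributes, get_checksum and get_mime disabled against /sys/class/net/test1, follow: false) suggests that the included tasks/get_interface_stat.yml contains roughly the following task. This is a sketch reconstructed from the logged module_args and the interface_stat variable seen later in the run, not a verbatim copy of the file:

# Sketch of tasks/get_interface_stat.yml (assumed content, inferred from
# the module_args logged above; the real file may differ in detail).
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # symlink exists only when the device is present
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                    # consumed by the assert task that follows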
13830 1727204077.47569: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204077.47578: _low_level_execute_command(): starting 13830 1727204077.47583: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204077.2006311-14811-180626830303852/ > /dev/null 2>&1 && sleep 0' 13830 1727204077.48072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.48075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.48112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204077.48116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.48118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204077.48120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.48170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.48190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.48193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.48228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.49982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.50043: stderr chunk (state=3): >>><<< 13830 1727204077.50047: stdout chunk (state=3): >>><<< 13830 1727204077.50070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204077.50075: handler run complete 13830 1727204077.50106: attempt loop complete, returning result 13830 1727204077.50108: _execute() done 13830 1727204077.50111: dumping result to json 13830 1727204077.50116: done dumping result, returning 13830 1727204077.50127: done running TaskExecutor() for managed-node3/TASK: Get stat for interface test1 [0affcd87-79f5-1659-6b02-00000000017b] 13830 1727204077.50135: sending task result for task 0affcd87-79f5-1659-6b02-00000000017b 13830 1727204077.50247: done sending task result for task 0affcd87-79f5-1659-6b02-00000000017b 13830 1727204077.50251: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204075.7688673, "block_size": 4096, "blocks": 0, "ctime": 1727204075.7688673, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26463, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204075.7688673, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13830 1727204077.50338: no more pending results, returning what we have 13830 1727204077.50341: results queue empty 13830 1727204077.50342: checking for any_errors_fatal 13830 1727204077.50344: done checking for any_errors_fatal 13830 1727204077.50344: checking for max_fail_percentage 13830 1727204077.50346: done checking for max_fail_percentage 13830 1727204077.50347: checking to see if all hosts have failed and the running result is not ok 13830 1727204077.50348: done checking to see if all hosts have failed 13830 1727204077.50348: getting the remaining hosts for this loop 13830 1727204077.50350: done getting the remaining hosts for this loop 13830 1727204077.50354: getting the next task for host managed-node3 13830 1727204077.50361: done getting next task for host managed-node3 13830 1727204077.50365: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13830 1727204077.50370: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204077.50373: getting variables 13830 1727204077.50374: in VariableManager get_vars() 13830 1727204077.50401: Calling all_inventory to load vars for managed-node3 13830 1727204077.50409: Calling groups_inventory to load vars for managed-node3 13830 1727204077.50412: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.50422: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.50424: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.50427: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.50583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.50701: done with get_vars() 13830 1727204077.50709: done getting variables 13830 1727204077.50782: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 13830 1727204077.50872: variable 'interface' from source: task vars 13830 1727204077.50875: variable 'dhcp_interface1' from source: play vars 13830 1727204077.50919: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.407) 0:00:10.587 ***** 13830 1727204077.50944: entering _queue_task() for managed-node3/assert 13830 1727204077.50945: Creating lock for assert 13830 1727204077.51147: worker is 1 (out of 1 available) 13830 1727204077.51159: exiting _queue_task() for managed-node3/assert 13830 1727204077.51171: done queuing things up, now waiting for results queue to drain 13830 1727204077.51173: waiting for pending results... 
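The assertion queued here (task path assert_device_present.yml:5) evaluates interface_stat.stat.exists, as the entries below confirm, and the include at assert_device_present.yml:3 is what pulls in get_interface_stat.yml first. A minimal sketch of what tasks/assert_device_present.yml likely contains, assuming only the include and assert observed in this run, is:

# Sketch of tasks/assert_device_present.yml (assumed layout, inferred from
# the task paths and the evaluated conditional in this log).
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists            # registered by the stat task above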
13830 1727204077.51338: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' 13830 1727204077.51412: in run() - task 0affcd87-79f5-1659-6b02-00000000011d 13830 1727204077.51423: variable 'ansible_search_path' from source: unknown 13830 1727204077.51428: variable 'ansible_search_path' from source: unknown 13830 1727204077.51458: calling self._execute() 13830 1727204077.51520: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.51523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.51534: variable 'omit' from source: magic vars 13830 1727204077.51793: variable 'ansible_distribution_major_version' from source: facts 13830 1727204077.51804: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204077.51809: variable 'omit' from source: magic vars 13830 1727204077.51853: variable 'omit' from source: magic vars 13830 1727204077.51921: variable 'interface' from source: task vars 13830 1727204077.51925: variable 'dhcp_interface1' from source: play vars 13830 1727204077.51974: variable 'dhcp_interface1' from source: play vars 13830 1727204077.51990: variable 'omit' from source: magic vars 13830 1727204077.52024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204077.52052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204077.52070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204077.52083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204077.52093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204077.52116: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204077.52119: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.52122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.52191: Set connection var ansible_connection to ssh 13830 1727204077.52201: Set connection var ansible_timeout to 10 13830 1727204077.52206: Set connection var ansible_shell_executable to /bin/sh 13830 1727204077.52210: Set connection var ansible_shell_type to sh 13830 1727204077.52214: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204077.52225: Set connection var ansible_pipelining to False 13830 1727204077.52242: variable 'ansible_shell_executable' from source: unknown 13830 1727204077.52245: variable 'ansible_connection' from source: unknown 13830 1727204077.52248: variable 'ansible_module_compression' from source: unknown 13830 1727204077.52250: variable 'ansible_shell_type' from source: unknown 13830 1727204077.52252: variable 'ansible_shell_executable' from source: unknown 13830 1727204077.52254: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.52258: variable 'ansible_pipelining' from source: unknown 13830 1727204077.52261: variable 'ansible_timeout' from source: unknown 13830 1727204077.52269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.52373: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204077.52383: variable 'omit' from source: magic vars 13830 1727204077.52388: starting attempt loop 13830 1727204077.52391: running the handler 13830 1727204077.52484: variable 'interface_stat' from source: set_fact 13830 1727204077.52500: Evaluated conditional (interface_stat.stat.exists): True 13830 1727204077.52505: handler run complete 13830 1727204077.52516: attempt loop complete, returning result 13830 1727204077.52520: _execute() done 13830 1727204077.52523: dumping result to json 13830 1727204077.52527: done dumping result, returning 13830 1727204077.52533: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' [0affcd87-79f5-1659-6b02-00000000011d] 13830 1727204077.52543: sending task result for task 0affcd87-79f5-1659-6b02-00000000011d 13830 1727204077.52631: done sending task result for task 0affcd87-79f5-1659-6b02-00000000011d 13830 1727204077.52633: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204077.52686: no more pending results, returning what we have 13830 1727204077.52689: results queue empty 13830 1727204077.52690: checking for any_errors_fatal 13830 1727204077.52700: done checking for any_errors_fatal 13830 1727204077.52701: checking for max_fail_percentage 13830 1727204077.52703: done checking for max_fail_percentage 13830 1727204077.52704: checking to see if all hosts have failed and the running result is not ok 13830 1727204077.52704: done checking to see if all hosts have failed 13830 1727204077.52705: getting the remaining hosts for this loop 13830 1727204077.52707: done getting the remaining hosts for this loop 13830 1727204077.52710: getting the next task for host managed-node3 13830 1727204077.52719: done getting next task for host managed-node3 13830 1727204077.52721: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13830 1727204077.52726: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204077.52732: getting variables 13830 1727204077.52733: in VariableManager get_vars() 13830 1727204077.52767: Calling all_inventory to load vars for managed-node3 13830 1727204077.52770: Calling groups_inventory to load vars for managed-node3 13830 1727204077.52774: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.52783: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.52785: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.52787: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.52915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.53035: done with get_vars() 13830 1727204077.53043: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.021) 0:00:10.609 ***** 13830 1727204077.53118: entering _queue_task() for managed-node3/include_tasks 13830 1727204077.53318: worker is 1 (out of 1 available) 13830 1727204077.53334: exiting _queue_task() for managed-node3/include_tasks 13830 1727204077.53346: done queuing things up, now waiting for results queue to drain 13830 1727204077.53348: waiting for pending results... 13830 1727204077.53504: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 13830 1727204077.53580: in run() - task 0affcd87-79f5-1659-6b02-000000000121 13830 1727204077.53591: variable 'ansible_search_path' from source: unknown 13830 1727204077.53594: variable 'ansible_search_path' from source: unknown 13830 1727204077.53622: calling self._execute() 13830 1727204077.53766: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.53770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.53778: variable 'omit' from source: magic vars 13830 1727204077.54039: variable 'ansible_distribution_major_version' from source: facts 13830 1727204077.54049: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204077.54058: _execute() done 13830 1727204077.54061: dumping result to json 13830 1727204077.54065: done dumping result, returning 13830 1727204077.54068: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1659-6b02-000000000121] 13830 1727204077.54072: sending task result for task 0affcd87-79f5-1659-6b02-000000000121 13830 1727204077.54159: done sending task result for task 0affcd87-79f5-1659-6b02-000000000121 13830 1727204077.54162: WORKER PROCESS EXITING 13830 1727204077.54193: no more pending results, returning what we have 13830 1727204077.54198: in VariableManager get_vars() 13830 1727204077.54302: Calling all_inventory to load vars for managed-node3 13830 1727204077.54305: Calling groups_inventory to load vars for managed-node3 13830 1727204077.54308: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.54317: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.54320: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.54323: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.54429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 13830 1727204077.54542: done with get_vars() 13830 1727204077.54547: variable 'ansible_search_path' from source: unknown 13830 1727204077.54548: variable 'ansible_search_path' from source: unknown 13830 1727204077.54573: we have included files to process 13830 1727204077.54574: generating all_blocks data 13830 1727204077.54576: done generating all_blocks data 13830 1727204077.54578: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204077.54579: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204077.54580: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204077.54708: done processing included file 13830 1727204077.54709: iterating over new_blocks loaded from include file 13830 1727204077.54710: in VariableManager get_vars() 13830 1727204077.54725: done with get_vars() 13830 1727204077.54726: filtering new block on tags 13830 1727204077.54746: done filtering new block on tags 13830 1727204077.54747: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 13830 1727204077.54751: extending task lists for all hosts with included blocks 13830 1727204077.54876: done extending task lists 13830 1727204077.54877: done processing included files 13830 1727204077.54878: results queue empty 13830 1727204077.54878: checking for any_errors_fatal 13830 1727204077.54880: done checking for any_errors_fatal 13830 1727204077.54881: checking for max_fail_percentage 13830 1727204077.54881: done checking for max_fail_percentage 13830 1727204077.54882: checking to see if all hosts have failed and the running result is not ok 13830 1727204077.54883: done checking to see if all hosts have failed 13830 1727204077.54883: getting the remaining hosts for this loop 13830 1727204077.54884: done getting the remaining hosts for this loop 13830 1727204077.54886: getting the next task for host managed-node3 13830 1727204077.54889: done getting next task for host managed-node3 13830 1727204077.54890: ^ task is: TASK: Get stat for interface {{ interface }} 13830 1727204077.54893: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204077.54894: getting variables 13830 1727204077.54895: in VariableManager get_vars() 13830 1727204077.54901: Calling all_inventory to load vars for managed-node3 13830 1727204077.54902: Calling groups_inventory to load vars for managed-node3 13830 1727204077.54904: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.54908: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.54910: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.54911: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.54996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.55124: done with get_vars() 13830 1727204077.55131: done getting variables 13830 1727204077.55247: variable 'interface' from source: task vars 13830 1727204077.55249: variable 'dhcp_interface2' from source: play vars 13830 1727204077.55294: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.022) 0:00:10.631 ***** 13830 1727204077.55317: entering _queue_task() for managed-node3/stat 13830 1727204077.55534: worker is 1 (out of 1 available) 13830 1727204077.55547: exiting _queue_task() for managed-node3/stat 13830 1727204077.55558: done queuing things up, now waiting for results queue to drain 13830 1727204077.55560: waiting for pending results... 
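At this point the same device-presence check is repeated for the second DHCP test interface: interface resolves from task vars via dhcp_interface2 ('test2') in play vars, just as it resolved via dhcp_interface1 ('test1') before. The wrapper playbook itself is not shown in this log; a plausible (hypothetical) sketch of how it might drive the check once per interface is:

# Hypothetical caller sketch: one way the test playbook could invoke
# assert_device_present.yml for both DHCP interfaces. The actual playbook
# may instead use two separate include_tasks entries with explicit vars.
- name: Assert that both DHCP test interfaces are present
  include_tasks: tasks/assert_device_present.yml
  vars:
    interface: "{{ item }}"
  loop:
    - "{{ dhcp_interface1 }}"   # 'test1' in this run
    - "{{ dhcp_interface2 }}"   # 'test2' in this run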
13830 1727204077.55720: running TaskExecutor() for managed-node3/TASK: Get stat for interface test2 13830 1727204077.55800: in run() - task 0affcd87-79f5-1659-6b02-00000000019f 13830 1727204077.55811: variable 'ansible_search_path' from source: unknown 13830 1727204077.55814: variable 'ansible_search_path' from source: unknown 13830 1727204077.55842: calling self._execute() 13830 1727204077.55903: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.55908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.55917: variable 'omit' from source: magic vars 13830 1727204077.56170: variable 'ansible_distribution_major_version' from source: facts 13830 1727204077.56181: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204077.56187: variable 'omit' from source: magic vars 13830 1727204077.56234: variable 'omit' from source: magic vars 13830 1727204077.56302: variable 'interface' from source: task vars 13830 1727204077.56306: variable 'dhcp_interface2' from source: play vars 13830 1727204077.56353: variable 'dhcp_interface2' from source: play vars 13830 1727204077.56369: variable 'omit' from source: magic vars 13830 1727204077.56403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204077.56433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204077.56452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204077.56468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204077.56479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204077.56501: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204077.56504: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.56506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.56580: Set connection var ansible_connection to ssh 13830 1727204077.56589: Set connection var ansible_timeout to 10 13830 1727204077.56594: Set connection var ansible_shell_executable to /bin/sh 13830 1727204077.56597: Set connection var ansible_shell_type to sh 13830 1727204077.56602: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204077.56610: Set connection var ansible_pipelining to False 13830 1727204077.56627: variable 'ansible_shell_executable' from source: unknown 13830 1727204077.56630: variable 'ansible_connection' from source: unknown 13830 1727204077.56633: variable 'ansible_module_compression' from source: unknown 13830 1727204077.56637: variable 'ansible_shell_type' from source: unknown 13830 1727204077.56640: variable 'ansible_shell_executable' from source: unknown 13830 1727204077.56642: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.56644: variable 'ansible_pipelining' from source: unknown 13830 1727204077.56648: variable 'ansible_timeout' from source: unknown 13830 1727204077.56652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.56800: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204077.56809: variable 'omit' from source: magic vars 13830 1727204077.56814: starting attempt loop 13830 1727204077.56817: running the handler 13830 1727204077.56828: _low_level_execute_command(): starting 13830 1727204077.56837: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204077.57363: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.57381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.57395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204077.57407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.57420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.57467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.57481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.57533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.59084: stdout chunk (state=3): >>>/root <<< 13830 1727204077.59186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.59275: stderr chunk (state=3): >>><<< 13830 1727204077.59279: stdout chunk (state=3): >>><<< 13830 1727204077.59396: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204077.59401: _low_level_execute_command(): starting 13830 
1727204077.59408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236 `" && echo ansible-tmp-1727204077.5930133-14840-169308077157236="` echo /root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236 `" ) && sleep 0' 13830 1727204077.60078: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.60100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.60103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.60125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.60182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.60186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.60233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.62186: stdout chunk (state=3): >>>ansible-tmp-1727204077.5930133-14840-169308077157236=/root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236 <<< 13830 1727204077.62239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.62325: stderr chunk (state=3): >>><<< 13830 1727204077.62336: stdout chunk (state=3): >>><<< 13830 1727204077.62369: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204077.5930133-14840-169308077157236=/root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 13830 1727204077.62670: variable 'ansible_module_compression' from source: unknown 13830 1727204077.62674: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13830 1727204077.62676: variable 'ansible_facts' from source: unknown 13830 1727204077.62678: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236/AnsiballZ_stat.py 13830 1727204077.62751: Sending initial data 13830 1727204077.62754: Sent initial data (153 bytes) 13830 1727204077.63712: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.63725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.63739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.63757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.63802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.63814: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204077.63826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.63844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204077.63855: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204077.63868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204077.63880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.63893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.63908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.63919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.63929: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204077.63942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.64021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.64043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.64058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.64130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.65839: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204077.65879: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; 
using 64 <<< 13830 1727204077.65919: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpeyefm7iu /root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236/AnsiballZ_stat.py <<< 13830 1727204077.65958: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204077.67082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.67288: stderr chunk (state=3): >>><<< 13830 1727204077.67291: stdout chunk (state=3): >>><<< 13830 1727204077.67294: done transferring module to remote 13830 1727204077.67296: _low_level_execute_command(): starting 13830 1727204077.67298: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236/ /root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236/AnsiballZ_stat.py && sleep 0' 13830 1727204077.67940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.67961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.67978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.67997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.68048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.68068: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204077.68083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.68106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204077.68117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204077.68127: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204077.68142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.68154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.68178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.68190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.68199: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204077.68212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.68301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.68324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.68344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.68423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.70113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.70218: stderr chunk (state=3): >>><<< 13830 1727204077.70237: stdout chunk (state=3): >>><<< 13830 1727204077.70354: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204077.70358: _low_level_execute_command(): starting 13830 1727204077.70360: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236/AnsiballZ_stat.py && sleep 0' 13830 1727204077.70990: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.71009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.71031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.71051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.71097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.71113: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204077.71139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.71160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204077.71177: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204077.71190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204077.71203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.71220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.71249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.71266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.71280: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204077.71296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.71389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.71415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.71437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.71527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.84583: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, 
"path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27049, "dev": 21, "nlink": 1, "atime": 1727204075.7765117, "mtime": 1727204075.7765117, "ctime": 1727204075.7765117, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13830 1727204077.85536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204077.85591: stderr chunk (state=3): >>><<< 13830 1727204077.85595: stdout chunk (state=3): >>><<< 13830 1727204077.85611: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27049, "dev": 21, "nlink": 1, "atime": 1727204075.7765117, "mtime": 1727204075.7765117, "ctime": 1727204075.7765117, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204077.85658: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204077.85667: _low_level_execute_command(): starting 13830 1727204077.85672: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204077.5930133-14840-169308077157236/ > /dev/null 2>&1 && sleep 0' 13830 1727204077.86339: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204077.86355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.86372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.86391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.86441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.86453: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204077.86468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.86486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204077.86500: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204077.86511: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204077.86526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204077.86543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204077.86557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204077.86571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204077.86582: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204077.86595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204077.86679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204077.86718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204077.86750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204077.86849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204077.88570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204077.88627: stderr chunk (state=3): >>><<< 13830 1727204077.88631: stdout chunk (state=3): >>><<< 13830 1727204077.88647: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204077.88654: handler run complete 13830 1727204077.88687: attempt loop complete, returning result 13830 1727204077.88690: _execute() done 13830 1727204077.88693: dumping result to json 13830 1727204077.88697: done dumping result, returning 13830 1727204077.88708: done running TaskExecutor() for managed-node3/TASK: Get stat for interface test2 [0affcd87-79f5-1659-6b02-00000000019f] 13830 1727204077.88712: sending task result for task 0affcd87-79f5-1659-6b02-00000000019f 13830 1727204077.88825: done sending task result for task 0affcd87-79f5-1659-6b02-00000000019f 13830 1727204077.88828: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204075.7765117, "block_size": 4096, "blocks": 0, "ctime": 1727204075.7765117, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27049, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204075.7765117, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13830 1727204077.88915: no more pending results, returning what we have 13830 1727204077.88919: results queue empty 13830 1727204077.88920: checking for any_errors_fatal 13830 1727204077.88921: done checking for any_errors_fatal 13830 1727204077.88922: checking for max_fail_percentage 13830 1727204077.88923: done checking for max_fail_percentage 13830 1727204077.88924: checking to see if all hosts have failed and the running result is not ok 13830 1727204077.88925: done checking to see if all hosts have failed 13830 1727204077.88926: getting the remaining hosts for this loop 13830 1727204077.88927: done getting the remaining hosts for this loop 13830 1727204077.88932: getting the next task for host managed-node3 13830 1727204077.88940: done getting next task for host managed-node3 13830 1727204077.88942: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13830 1727204077.88946: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204077.88953: getting variables 13830 1727204077.88955: in VariableManager get_vars() 13830 1727204077.88983: Calling all_inventory to load vars for managed-node3 13830 1727204077.88986: Calling groups_inventory to load vars for managed-node3 13830 1727204077.88989: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.88999: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.89001: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.89003: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.89119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.89238: done with get_vars() 13830 1727204077.89247: done getting variables 13830 1727204077.89291: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204077.89515: variable 'interface' from source: task vars 13830 1727204077.89519: variable 'dhcp_interface2' from source: play vars 13830 1727204077.89580: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.342) 0:00:10.974 ***** 13830 1727204077.89612: entering _queue_task() for managed-node3/assert 13830 1727204077.89861: worker is 1 (out of 1 available) 13830 1727204077.89875: exiting _queue_task() for managed-node3/assert 13830 1727204077.89885: done queuing things up, now waiting for results queue to drain 13830 1727204077.89887: waiting for pending results... 
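Note: the assertion queued here reduces to a single conditional on the stat result; a hedged sketch of what assert_device_present.yml:5 likely contains (the condition matches the 'Evaluated conditional (interface_stat.stat.exists)' line that follows; any failure message is omitted rather than guessed):

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists   # evaluated to True in the trace below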
13830 1727204077.90138: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' 13830 1727204077.90268: in run() - task 0affcd87-79f5-1659-6b02-000000000122 13830 1727204077.90287: variable 'ansible_search_path' from source: unknown 13830 1727204077.90294: variable 'ansible_search_path' from source: unknown 13830 1727204077.90338: calling self._execute() 13830 1727204077.90419: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.90430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.90447: variable 'omit' from source: magic vars 13830 1727204077.90873: variable 'ansible_distribution_major_version' from source: facts 13830 1727204077.90891: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204077.90902: variable 'omit' from source: magic vars 13830 1727204077.90965: variable 'omit' from source: magic vars 13830 1727204077.91060: variable 'interface' from source: task vars 13830 1727204077.91072: variable 'dhcp_interface2' from source: play vars 13830 1727204077.91138: variable 'dhcp_interface2' from source: play vars 13830 1727204077.91161: variable 'omit' from source: magic vars 13830 1727204077.91217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204077.91256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204077.91285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204077.91312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204077.91326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204077.91359: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204077.91370: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.91377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.91476: Set connection var ansible_connection to ssh 13830 1727204077.91491: Set connection var ansible_timeout to 10 13830 1727204077.91499: Set connection var ansible_shell_executable to /bin/sh 13830 1727204077.91504: Set connection var ansible_shell_type to sh 13830 1727204077.91512: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204077.91534: Set connection var ansible_pipelining to False 13830 1727204077.91561: variable 'ansible_shell_executable' from source: unknown 13830 1727204077.91571: variable 'ansible_connection' from source: unknown 13830 1727204077.91578: variable 'ansible_module_compression' from source: unknown 13830 1727204077.91583: variable 'ansible_shell_type' from source: unknown 13830 1727204077.91588: variable 'ansible_shell_executable' from source: unknown 13830 1727204077.91594: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.91600: variable 'ansible_pipelining' from source: unknown 13830 1727204077.91606: variable 'ansible_timeout' from source: unknown 13830 1727204077.91614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.91756: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204077.91775: variable 'omit' from source: magic vars 13830 1727204077.91785: starting attempt loop 13830 1727204077.91791: running the handler 13830 1727204077.91930: variable 'interface_stat' from source: set_fact 13830 1727204077.91959: Evaluated conditional (interface_stat.stat.exists): True 13830 1727204077.91971: handler run complete 13830 1727204077.91989: attempt loop complete, returning result 13830 1727204077.91995: _execute() done 13830 1727204077.92001: dumping result to json 13830 1727204077.92008: done dumping result, returning 13830 1727204077.92018: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' [0affcd87-79f5-1659-6b02-000000000122] 13830 1727204077.92026: sending task result for task 0affcd87-79f5-1659-6b02-000000000122 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204077.92174: no more pending results, returning what we have 13830 1727204077.92178: results queue empty 13830 1727204077.92179: checking for any_errors_fatal 13830 1727204077.92188: done checking for any_errors_fatal 13830 1727204077.92188: checking for max_fail_percentage 13830 1727204077.92190: done checking for max_fail_percentage 13830 1727204077.92191: checking to see if all hosts have failed and the running result is not ok 13830 1727204077.92192: done checking to see if all hosts have failed 13830 1727204077.92193: getting the remaining hosts for this loop 13830 1727204077.92194: done getting the remaining hosts for this loop 13830 1727204077.92199: getting the next task for host managed-node3 13830 1727204077.92208: done getting next task for host managed-node3 13830 1727204077.92211: ^ task is: TASK: Test 13830 1727204077.92215: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204077.92219: getting variables 13830 1727204077.92221: in VariableManager get_vars() 13830 1727204077.92251: Calling all_inventory to load vars for managed-node3 13830 1727204077.92254: Calling groups_inventory to load vars for managed-node3 13830 1727204077.92258: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.92271: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.92273: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.92276: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.92498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.92705: done with get_vars() 13830 1727204077.92715: done getting variables 13830 1727204077.93029: done sending task result for task 0affcd87-79f5-1659-6b02-000000000122 13830 1727204077.93032: WORKER PROCESS EXITING TASK [Test] ******************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.034) 0:00:11.008 ***** 13830 1727204077.93077: entering _queue_task() for managed-node3/include_tasks 13830 1727204077.93323: worker is 1 (out of 1 available) 13830 1727204077.93335: exiting _queue_task() for managed-node3/include_tasks 13830 1727204077.93347: done queuing things up, now waiting for results queue to drain 13830 1727204077.93348: waiting for pending results... 13830 1727204077.93602: running TaskExecutor() for managed-node3/TASK: Test 13830 1727204077.93705: in run() - task 0affcd87-79f5-1659-6b02-00000000008c 13830 1727204077.93724: variable 'ansible_search_path' from source: unknown 13830 1727204077.93732: variable 'ansible_search_path' from source: unknown 13830 1727204077.93780: variable 'lsr_test' from source: include params 13830 1727204077.93980: variable 'lsr_test' from source: include params 13830 1727204077.94053: variable 'omit' from source: magic vars 13830 1727204077.94176: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.94183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.94191: variable 'omit' from source: magic vars 13830 1727204077.94359: variable 'ansible_distribution_major_version' from source: facts 13830 1727204077.94368: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204077.94377: variable 'item' from source: unknown 13830 1727204077.94428: variable 'item' from source: unknown 13830 1727204077.94452: variable 'item' from source: unknown 13830 1727204077.94499: variable 'item' from source: unknown 13830 1727204077.94621: dumping result to json 13830 1727204077.94624: done dumping result, returning 13830 1727204077.94628: done running TaskExecutor() for managed-node3/TASK: Test [0affcd87-79f5-1659-6b02-00000000008c] 13830 1727204077.94633: sending task result for task 0affcd87-79f5-1659-6b02-00000000008c 13830 1727204077.94671: done sending task result for task 0affcd87-79f5-1659-6b02-00000000008c 13830 1727204077.94673: WORKER PROCESS EXITING 13830 1727204077.94694: no more pending results, returning what we have 13830 1727204077.94699: in VariableManager get_vars() 13830 1727204077.94733: Calling all_inventory to load vars for managed-node3 13830 1727204077.94736: Calling groups_inventory to load vars for managed-node3 13830 1727204077.94738: 
Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.94748: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.94751: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.94753: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.94877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.94991: done with get_vars() 13830 1727204077.94997: variable 'ansible_search_path' from source: unknown 13830 1727204077.94998: variable 'ansible_search_path' from source: unknown 13830 1727204077.95025: we have included files to process 13830 1727204077.95026: generating all_blocks data 13830 1727204077.95027: done generating all_blocks data 13830 1727204077.95032: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 13830 1727204077.95032: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 13830 1727204077.95034: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 13830 1727204077.95333: done processing included file 13830 1727204077.95334: iterating over new_blocks loaded from include file 13830 1727204077.95335: in VariableManager get_vars() 13830 1727204077.95345: done with get_vars() 13830 1727204077.95346: filtering new block on tags 13830 1727204077.95370: done filtering new block on tags 13830 1727204077.95372: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml for managed-node3 => (item=tasks/create_bond_profile.yml) 13830 1727204077.95375: extending task lists for all hosts with included blocks 13830 1727204077.96305: done extending task lists 13830 1727204077.96306: done processing included files 13830 1727204077.96307: results queue empty 13830 1727204077.96308: checking for any_errors_fatal 13830 1727204077.96314: done checking for any_errors_fatal 13830 1727204077.96315: checking for max_fail_percentage 13830 1727204077.96316: done checking for max_fail_percentage 13830 1727204077.96317: checking to see if all hosts have failed and the running result is not ok 13830 1727204077.96318: done checking to see if all hosts have failed 13830 1727204077.96319: getting the remaining hosts for this loop 13830 1727204077.96320: done getting the remaining hosts for this loop 13830 1727204077.96323: getting the next task for host managed-node3 13830 1727204077.96327: done getting next task for host managed-node3 13830 1727204077.96329: ^ task is: TASK: Include network role 13830 1727204077.96331: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204077.96334: getting variables 13830 1727204077.96335: in VariableManager get_vars() 13830 1727204077.96344: Calling all_inventory to load vars for managed-node3 13830 1727204077.96346: Calling groups_inventory to load vars for managed-node3 13830 1727204077.96348: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.96353: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.96356: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.96359: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.96515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.96726: done with get_vars() 13830 1727204077.96739: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.037) 0:00:11.046 ***** 13830 1727204077.96822: entering _queue_task() for managed-node3/include_role 13830 1727204077.96824: Creating lock for include_role 13830 1727204077.97149: worker is 1 (out of 1 available) 13830 1727204077.97161: exiting _queue_task() for managed-node3/include_role 13830 1727204077.97178: done queuing things up, now waiting for results queue to drain 13830 1727204077.97183: waiting for pending results... 13830 1727204077.97451: running TaskExecutor() for managed-node3/TASK: Include network role 13830 1727204077.97586: in run() - task 0affcd87-79f5-1659-6b02-0000000001c5 13830 1727204077.97608: variable 'ansible_search_path' from source: unknown 13830 1727204077.97620: variable 'ansible_search_path' from source: unknown 13830 1727204077.97666: calling self._execute() 13830 1727204077.97752: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204077.97762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204077.97776: variable 'omit' from source: magic vars 13830 1727204077.98155: variable 'ansible_distribution_major_version' from source: facts 13830 1727204077.98178: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204077.98187: _execute() done 13830 1727204077.98194: dumping result to json 13830 1727204077.98201: done dumping result, returning 13830 1727204077.98209: done running TaskExecutor() for managed-node3/TASK: Include network role [0affcd87-79f5-1659-6b02-0000000001c5] 13830 1727204077.98218: sending task result for task 0affcd87-79f5-1659-6b02-0000000001c5 13830 1727204077.98401: no more pending results, returning what we have 13830 1727204077.98406: in VariableManager get_vars() 13830 1727204077.98439: Calling all_inventory to load vars for managed-node3 13830 1727204077.98443: Calling groups_inventory to load vars for managed-node3 13830 1727204077.98446: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204077.98460: Calling all_plugins_play to load vars for managed-node3 13830 1727204077.98462: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204077.98474: Calling groups_plugins_play to load vars for managed-node3 13830 1727204077.98692: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204077.98908: done with get_vars() 13830 1727204077.98915: variable 'ansible_search_path' from source: unknown 13830 1727204077.98916: variable 'ansible_search_path' from source: unknown 13830 1727204077.99228: variable 'omit' from source: magic vars 13830 1727204077.99272: variable 'omit' from source: magic vars 13830 1727204077.99289: variable 'omit' from source: magic vars 13830 1727204077.99293: we have included files to process 13830 1727204077.99294: generating all_blocks data 13830 1727204077.99297: done generating all_blocks data 13830 1727204077.99298: processing included file: fedora.linux_system_roles.network 13830 1727204077.99438: in VariableManager get_vars() 13830 1727204077.99450: done with get_vars() 13830 1727204077.99519: done sending task result for task 0affcd87-79f5-1659-6b02-0000000001c5 13830 1727204077.99523: WORKER PROCESS EXITING 13830 1727204077.99575: in VariableManager get_vars() 13830 1727204077.99593: done with get_vars() 13830 1727204077.99646: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13830 1727204078.00113: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13830 1727204078.00256: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13830 1727204078.00993: in VariableManager get_vars() 13830 1727204078.01013: done with get_vars() 13830 1727204078.01459: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13830 1727204078.03288: iterating over new_blocks loaded from include file 13830 1727204078.03290: in VariableManager get_vars() 13830 1727204078.03307: done with get_vars() 13830 1727204078.03309: filtering new block on tags 13830 1727204078.03574: done filtering new block on tags 13830 1727204078.03653: in VariableManager get_vars() 13830 1727204078.03673: done with get_vars() 13830 1727204078.03675: filtering new block on tags 13830 1727204078.03691: done filtering new block on tags 13830 1727204078.03693: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node3 13830 1727204078.03699: extending task lists for all hosts with included blocks 13830 1727204078.03875: done extending task lists 13830 1727204078.03876: done processing included files 13830 1727204078.03877: results queue empty 13830 1727204078.03878: checking for any_errors_fatal 13830 1727204078.03881: done checking for any_errors_fatal 13830 1727204078.03882: checking for max_fail_percentage 13830 1727204078.03883: done checking for max_fail_percentage 13830 1727204078.03883: checking to see if all hosts have failed and the running result is not ok 13830 1727204078.03884: done checking to see if all hosts have failed 13830 1727204078.03885: getting the remaining hosts for this loop 13830 1727204078.03886: done getting the remaining hosts for this loop 13830 1727204078.03888: getting the next task for host managed-node3 13830 1727204078.03892: done getting next task for host managed-node3 13830 1727204078.03895: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13830 1727204078.03898: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204078.03906: getting variables 13830 1727204078.03907: in VariableManager get_vars() 13830 1727204078.03920: Calling all_inventory to load vars for managed-node3 13830 1727204078.03922: Calling groups_inventory to load vars for managed-node3 13830 1727204078.03924: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204078.03931: Calling all_plugins_play to load vars for managed-node3 13830 1727204078.03934: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204078.03937: Calling groups_plugins_play to load vars for managed-node3 13830 1727204078.04105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204078.04311: done with get_vars() 13830 1727204078.04320: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.075) 0:00:11.122 ***** 13830 1727204078.04395: entering _queue_task() for managed-node3/include_tasks 13830 1727204078.04695: worker is 1 (out of 1 available) 13830 1727204078.04708: exiting _queue_task() for managed-node3/include_tasks 13830 1727204078.04723: done queuing things up, now waiting for results queue to drain 13830 1727204078.04725: waiting for pending results... 
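Note: the last few records trace a three-level include chain (run_test.yml -> create_bond_profile.yml -> the network role's tasks/main.yml, which immediately includes set_facts.yml). A hedged sketch of the three fragments, with file paths and task names taken from the log and the exact loop wiring assumed:

    # tasks/run_test.yml, around line 30 (shape assumed)
    - name: Test
      include_tasks: "{{ item }}"   # the log shows item resolving to tasks/create_bond_profile.yml
      loop: "{{ lsr_test }}"        # lsr_test comes from include params per the trace

    # tasks/create_bond_profile.yml, line 3 (shape assumed)
    - name: Include network role
      include_role:
        name: fedora.linux_system_roles.network

    # roles/network/tasks/main.yml, line 4 (shape assumed)
    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml

The trace also evaluates ansible_distribution_major_version != '6' before each of these tasks.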
13830 1727204078.05004: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13830 1727204078.05149: in run() - task 0affcd87-79f5-1659-6b02-000000000277 13830 1727204078.05179: variable 'ansible_search_path' from source: unknown 13830 1727204078.05187: variable 'ansible_search_path' from source: unknown 13830 1727204078.05226: calling self._execute() 13830 1727204078.05321: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204078.05334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204078.05347: variable 'omit' from source: magic vars 13830 1727204078.05731: variable 'ansible_distribution_major_version' from source: facts 13830 1727204078.05749: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204078.05759: _execute() done 13830 1727204078.05770: dumping result to json 13830 1727204078.05779: done dumping result, returning 13830 1727204078.05791: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1659-6b02-000000000277] 13830 1727204078.05801: sending task result for task 0affcd87-79f5-1659-6b02-000000000277 13830 1727204078.05915: done sending task result for task 0affcd87-79f5-1659-6b02-000000000277 13830 1727204078.05922: WORKER PROCESS EXITING 13830 1727204078.05972: no more pending results, returning what we have 13830 1727204078.05978: in VariableManager get_vars() 13830 1727204078.06023: Calling all_inventory to load vars for managed-node3 13830 1727204078.06027: Calling groups_inventory to load vars for managed-node3 13830 1727204078.06031: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204078.06045: Calling all_plugins_play to load vars for managed-node3 13830 1727204078.06048: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204078.06051: Calling groups_plugins_play to load vars for managed-node3 13830 1727204078.06248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204078.06462: done with get_vars() 13830 1727204078.06472: variable 'ansible_search_path' from source: unknown 13830 1727204078.06473: variable 'ansible_search_path' from source: unknown 13830 1727204078.06517: we have included files to process 13830 1727204078.06518: generating all_blocks data 13830 1727204078.06520: done generating all_blocks data 13830 1727204078.06527: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204078.06528: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204078.06533: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204078.07544: done processing included file 13830 1727204078.07547: iterating over new_blocks loaded from include file 13830 1727204078.07548: in VariableManager get_vars() 13830 1727204078.07573: done with get_vars() 13830 1727204078.07574: filtering new block on tags 13830 1727204078.07604: done filtering new block on tags 13830 1727204078.07608: in VariableManager get_vars() 13830 1727204078.07631: done with get_vars() 13830 1727204078.07633: filtering new block on tags 13830 1727204078.07678: done filtering new block on tags 13830 1727204078.07681: in 
VariableManager get_vars() 13830 1727204078.07702: done with get_vars() 13830 1727204078.07704: filtering new block on tags 13830 1727204078.07747: done filtering new block on tags 13830 1727204078.07750: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 13830 1727204078.07755: extending task lists for all hosts with included blocks 13830 1727204078.09451: done extending task lists 13830 1727204078.09453: done processing included files 13830 1727204078.09453: results queue empty 13830 1727204078.09454: checking for any_errors_fatal 13830 1727204078.09457: done checking for any_errors_fatal 13830 1727204078.09458: checking for max_fail_percentage 13830 1727204078.09459: done checking for max_fail_percentage 13830 1727204078.09460: checking to see if all hosts have failed and the running result is not ok 13830 1727204078.09460: done checking to see if all hosts have failed 13830 1727204078.09461: getting the remaining hosts for this loop 13830 1727204078.09462: done getting the remaining hosts for this loop 13830 1727204078.09466: getting the next task for host managed-node3 13830 1727204078.09471: done getting next task for host managed-node3 13830 1727204078.09473: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13830 1727204078.09478: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204078.09487: getting variables 13830 1727204078.09488: in VariableManager get_vars() 13830 1727204078.09503: Calling all_inventory to load vars for managed-node3 13830 1727204078.09505: Calling groups_inventory to load vars for managed-node3 13830 1727204078.09507: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204078.09512: Calling all_plugins_play to load vars for managed-node3 13830 1727204078.09514: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204078.09516: Calling groups_plugins_play to load vars for managed-node3 13830 1727204078.09667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204078.09857: done with get_vars() 13830 1727204078.09869: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.055) 0:00:11.177 ***** 13830 1727204078.09942: entering _queue_task() for managed-node3/setup 13830 1727204078.10249: worker is 1 (out of 1 available) 13830 1727204078.10261: exiting _queue_task() for managed-node3/setup 13830 1727204078.10275: done queuing things up, now waiting for results queue to drain 13830 1727204078.10276: waiting for pending results... 13830 1727204078.11204: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13830 1727204078.11554: in run() - task 0affcd87-79f5-1659-6b02-0000000002d4 13830 1727204078.11579: variable 'ansible_search_path' from source: unknown 13830 1727204078.11587: variable 'ansible_search_path' from source: unknown 13830 1727204078.11631: calling self._execute() 13830 1727204078.11831: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204078.11844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204078.11857: variable 'omit' from source: magic vars 13830 1727204078.12860: variable 'ansible_distribution_major_version' from source: facts 13830 1727204078.12879: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204078.13104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204078.15490: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204078.15556: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204078.15597: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204078.15645: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204078.15673: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204078.15755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204078.15784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13830 1727204078.15812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204078.15856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204078.15871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204078.15925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204078.15951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204078.15976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204078.16015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204078.16037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204078.16190: variable '__network_required_facts' from source: role '' defaults 13830 1727204078.16199: variable 'ansible_facts' from source: unknown 13830 1727204078.16292: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13830 1727204078.16296: when evaluation is False, skipping this task 13830 1727204078.16299: _execute() done 13830 1727204078.16301: dumping result to json 13830 1727204078.16303: done dumping result, returning 13830 1727204078.16312: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1659-6b02-0000000002d4] 13830 1727204078.16317: sending task result for task 0affcd87-79f5-1659-6b02-0000000002d4 13830 1727204078.16419: done sending task result for task 0affcd87-79f5-1659-6b02-0000000002d4 13830 1727204078.16422: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204078.16501: no more pending results, returning what we have 13830 1727204078.16505: results queue empty 13830 1727204078.16505: checking for any_errors_fatal 13830 1727204078.16507: done checking for any_errors_fatal 13830 1727204078.16508: checking for max_fail_percentage 13830 1727204078.16510: done checking for max_fail_percentage 13830 1727204078.16511: checking to see if all hosts have failed and the running result is not ok 13830 1727204078.16512: done checking to see if all hosts have failed 13830 1727204078.16512: getting the remaining hosts for this loop 13830 1727204078.16514: done getting the remaining hosts for 
this loop 13830 1727204078.16518: getting the next task for host managed-node3 13830 1727204078.16531: done getting next task for host managed-node3 13830 1727204078.16535: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13830 1727204078.16541: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204078.16556: getting variables 13830 1727204078.16558: in VariableManager get_vars() 13830 1727204078.16598: Calling all_inventory to load vars for managed-node3 13830 1727204078.16601: Calling groups_inventory to load vars for managed-node3 13830 1727204078.16603: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204078.16614: Calling all_plugins_play to load vars for managed-node3 13830 1727204078.16616: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204078.16624: Calling groups_plugins_play to load vars for managed-node3 13830 1727204078.16835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204078.17047: done with get_vars() 13830 1727204078.17059: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.074) 0:00:11.252 ***** 13830 1727204078.17393: entering _queue_task() for managed-node3/stat 13830 1727204078.17657: worker is 1 (out of 1 available) 13830 1727204078.17673: exiting _queue_task() for managed-node3/stat 13830 1727204078.17686: done queuing things up, now waiting for results queue to drain 13830 1727204078.17687: waiting for pending results... 
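Note: the skip recorded above comes from a guard that only gathers facts when something the role needs is missing; a minimal sketch of the task at set_facts.yml:3, with the when clause copied verbatim from the 'Evaluated conditional' line and the gather arguments assumed (the real ones are hidden because the result is no_log):

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset:
          - min                      # assumed; the actual subset is not visible in the censored result
      no_log: true                   # matches the "output has been hidden" note in the skip result
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0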
13830 1727204078.17969: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13830 1727204078.18103: in run() - task 0affcd87-79f5-1659-6b02-0000000002d6 13830 1727204078.18115: variable 'ansible_search_path' from source: unknown 13830 1727204078.18119: variable 'ansible_search_path' from source: unknown 13830 1727204078.18159: calling self._execute() 13830 1727204078.18248: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204078.18251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204078.18262: variable 'omit' from source: magic vars 13830 1727204078.18628: variable 'ansible_distribution_major_version' from source: facts 13830 1727204078.18644: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204078.18817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204078.19085: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204078.19127: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204078.19162: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204078.19193: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204078.19278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204078.19304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204078.19337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204078.19362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204078.19454: variable '__network_is_ostree' from source: set_fact 13830 1727204078.19461: Evaluated conditional (not __network_is_ostree is defined): False 13830 1727204078.19466: when evaluation is False, skipping this task 13830 1727204078.19470: _execute() done 13830 1727204078.19472: dumping result to json 13830 1727204078.19475: done dumping result, returning 13830 1727204078.19481: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1659-6b02-0000000002d6] 13830 1727204078.19487: sending task result for task 0affcd87-79f5-1659-6b02-0000000002d6 13830 1727204078.19577: done sending task result for task 0affcd87-79f5-1659-6b02-0000000002d6 13830 1727204078.19580: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13830 1727204078.19642: no more pending results, returning what we have 13830 1727204078.19646: results queue empty 13830 1727204078.19647: checking for any_errors_fatal 13830 1727204078.19656: done checking for any_errors_fatal 13830 1727204078.19657: checking for 
max_fail_percentage 13830 1727204078.19660: done checking for max_fail_percentage 13830 1727204078.19661: checking to see if all hosts have failed and the running result is not ok 13830 1727204078.19662: done checking to see if all hosts have failed 13830 1727204078.19662: getting the remaining hosts for this loop 13830 1727204078.19667: done getting the remaining hosts for this loop 13830 1727204078.19671: getting the next task for host managed-node3 13830 1727204078.19679: done getting next task for host managed-node3 13830 1727204078.19683: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13830 1727204078.19689: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204078.19703: getting variables 13830 1727204078.19705: in VariableManager get_vars() 13830 1727204078.19746: Calling all_inventory to load vars for managed-node3 13830 1727204078.19749: Calling groups_inventory to load vars for managed-node3 13830 1727204078.19752: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204078.19762: Calling all_plugins_play to load vars for managed-node3 13830 1727204078.19767: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204078.19770: Calling groups_plugins_play to load vars for managed-node3 13830 1727204078.19950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204078.20169: done with get_vars() 13830 1727204078.20180: done getting variables 13830 1727204078.20247: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.030) 0:00:11.282 ***** 13830 1727204078.20473: entering _queue_task() for managed-node3/set_fact 13830 1727204078.20722: worker is 1 (out of 1 available) 13830 1727204078.20738: exiting _queue_task() for managed-node3/set_fact 13830 1727204078.20751: done queuing things up, now waiting for results queue to drain 13830 1727204078.20753: waiting for pending results... 13830 1727204078.21698: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13830 1727204078.21943: in run() - task 0affcd87-79f5-1659-6b02-0000000002d7 13830 1727204078.22074: variable 'ansible_search_path' from source: unknown 13830 1727204078.22078: variable 'ansible_search_path' from source: unknown 13830 1727204078.22113: calling self._execute() 13830 1727204078.22311: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204078.22315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204078.22325: variable 'omit' from source: magic vars 13830 1727204078.23299: variable 'ansible_distribution_major_version' from source: facts 13830 1727204078.23310: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204078.23621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204078.23893: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204078.23942: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204078.23974: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204078.24005: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204078.24097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204078.24124: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204078.24153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204078.24184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204078.24280: variable '__network_is_ostree' from source: set_fact 13830 1727204078.24287: Evaluated conditional (not __network_is_ostree is defined): False 13830 1727204078.24290: when evaluation is False, skipping this task 13830 1727204078.24293: _execute() done 13830 1727204078.24296: dumping result to json 13830 1727204078.24298: done dumping result, returning 13830 1727204078.24307: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1659-6b02-0000000002d7] 13830 1727204078.24312: sending task result for task 0affcd87-79f5-1659-6b02-0000000002d7 13830 1727204078.24414: done sending task result for task 0affcd87-79f5-1659-6b02-0000000002d7 13830 1727204078.24417: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13830 1727204078.24498: no more pending results, returning what we have 13830 1727204078.24502: results queue empty 13830 1727204078.24502: checking for any_errors_fatal 13830 1727204078.24508: done checking for any_errors_fatal 13830 1727204078.24508: checking for max_fail_percentage 13830 1727204078.24510: done checking for max_fail_percentage 13830 1727204078.24511: checking to see if all hosts have failed and the running result is not ok 13830 1727204078.24512: done checking to see if all hosts have failed 13830 1727204078.24513: getting the remaining hosts for this loop 13830 1727204078.24515: done getting the remaining hosts for this loop 13830 1727204078.24519: getting the next task for host managed-node3 13830 1727204078.24533: done getting next task for host managed-node3 13830 1727204078.24537: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13830 1727204078.24543: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204078.24558: getting variables 13830 1727204078.24560: in VariableManager get_vars() 13830 1727204078.24599: Calling all_inventory to load vars for managed-node3 13830 1727204078.24602: Calling groups_inventory to load vars for managed-node3 13830 1727204078.24605: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204078.24615: Calling all_plugins_play to load vars for managed-node3 13830 1727204078.24617: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204078.24620: Calling groups_plugins_play to load vars for managed-node3 13830 1727204078.24852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204078.25245: done with get_vars() 13830 1727204078.25257: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.048) 0:00:11.331 ***** 13830 1727204078.25354: entering _queue_task() for managed-node3/service_facts 13830 1727204078.25356: Creating lock for service_facts 13830 1727204078.25628: worker is 1 (out of 1 available) 13830 1727204078.25642: exiting _queue_task() for managed-node3/service_facts 13830 1727204078.25653: done queuing things up, now waiting for results queue to drain 13830 1727204078.25654: waiting for pending results... 
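The service_facts task queued here drives the long exchange that follows in the log: Ansible reuses the multiplexed SSH master, creates a remote temp directory, uploads the AnsiballZ-packaged service_facts module, runs it with /usr/bin/python3.9, and receives a JSON map of systemd units under ansible_facts.services. A minimal sketch of such a task plus an illustrative consumer of the resulting map (the debug task and its wording are assumptions added for illustration, not part of the role):

- name: Check which services are running
  ansible.builtin.service_facts:       # populates ansible_facts.services, keyed by unit name

- name: Illustrative consumer of the gathered map (not from the role)
  ansible.builtin.debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
  when: "'NetworkManager.service' in ansible_facts.services"

Each entry in the returned map carries name, state, status, and source fields, as can be seen in the module stdout captured below.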
13830 1727204078.25926: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 13830 1727204078.26067: in run() - task 0affcd87-79f5-1659-6b02-0000000002d9 13830 1727204078.26080: variable 'ansible_search_path' from source: unknown 13830 1727204078.26083: variable 'ansible_search_path' from source: unknown 13830 1727204078.26121: calling self._execute() 13830 1727204078.26199: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204078.26210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204078.26221: variable 'omit' from source: magic vars 13830 1727204078.26581: variable 'ansible_distribution_major_version' from source: facts 13830 1727204078.26593: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204078.26599: variable 'omit' from source: magic vars 13830 1727204078.26686: variable 'omit' from source: magic vars 13830 1727204078.26716: variable 'omit' from source: magic vars 13830 1727204078.26764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204078.26800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204078.26821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204078.26841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204078.26851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204078.26886: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204078.26889: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204078.26891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204078.26993: Set connection var ansible_connection to ssh 13830 1727204078.27003: Set connection var ansible_timeout to 10 13830 1727204078.27009: Set connection var ansible_shell_executable to /bin/sh 13830 1727204078.27012: Set connection var ansible_shell_type to sh 13830 1727204078.27017: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204078.27027: Set connection var ansible_pipelining to False 13830 1727204078.27053: variable 'ansible_shell_executable' from source: unknown 13830 1727204078.27056: variable 'ansible_connection' from source: unknown 13830 1727204078.27059: variable 'ansible_module_compression' from source: unknown 13830 1727204078.27061: variable 'ansible_shell_type' from source: unknown 13830 1727204078.27065: variable 'ansible_shell_executable' from source: unknown 13830 1727204078.27068: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204078.27070: variable 'ansible_pipelining' from source: unknown 13830 1727204078.27072: variable 'ansible_timeout' from source: unknown 13830 1727204078.27081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204078.27276: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204078.27286: variable 'omit' from source: magic vars 13830 
1727204078.27291: starting attempt loop 13830 1727204078.27297: running the handler 13830 1727204078.27311: _low_level_execute_command(): starting 13830 1727204078.27319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204078.28101: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204078.28113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204078.28123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204078.28140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204078.28186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204078.28195: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204078.28209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.28222: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204078.28231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204078.28242: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204078.28250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204078.28259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204078.28274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204078.28281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204078.28289: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204078.28299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.28378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204078.28399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204078.28413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204078.28496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204078.30279: stdout chunk (state=3): >>>/root <<< 13830 1727204078.30471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204078.30475: stdout chunk (state=3): >>><<< 13830 1727204078.30478: stderr chunk (state=3): >>><<< 13830 1727204078.30482: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204078.30484: _low_level_execute_command(): starting 13830 1727204078.30487: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002 `" && echo ansible-tmp-1727204078.30355-14869-102084178120002="` echo /root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002 `" ) && sleep 0' 13830 1727204078.32653: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204078.32657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204078.32704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.32708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204078.32725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204078.32731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.32812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204078.32818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204078.32838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204078.33082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204078.34739: stdout chunk (state=3): >>>ansible-tmp-1727204078.30355-14869-102084178120002=/root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002 <<< 13830 1727204078.34870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204078.34927: stderr chunk (state=3): >>><<< 13830 1727204078.34930: stdout chunk (state=3): >>><<< 13830 1727204078.34953: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204078.30355-14869-102084178120002=/root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204078.35006: variable 'ansible_module_compression' from source: unknown 13830 1727204078.35056: ANSIBALLZ: Using lock for service_facts 13830 1727204078.35059: ANSIBALLZ: Acquiring lock 13830 1727204078.35062: ANSIBALLZ: Lock acquired: 140043657050448 13830 1727204078.35066: ANSIBALLZ: Creating module 13830 1727204078.62503: ANSIBALLZ: Writing module into payload 13830 1727204078.62625: ANSIBALLZ: Writing module 13830 1727204078.62653: ANSIBALLZ: Renaming module 13830 1727204078.62656: ANSIBALLZ: Done creating module 13830 1727204078.62683: variable 'ansible_facts' from source: unknown 13830 1727204078.62759: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002/AnsiballZ_service_facts.py 13830 1727204078.62921: Sending initial data 13830 1727204078.62924: Sent initial data (160 bytes) 13830 1727204078.63959: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204078.63972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204078.63994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204078.64008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204078.64049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204078.64056: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204078.64067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.64081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204078.64091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204078.64104: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204078.64111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204078.64121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204078.64137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204078.64145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204078.64151: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204078.64161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.64243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204078.64262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204078.64278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 
1727204078.64353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204078.66112: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204078.66152: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204078.66219: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp3_cfabsw /root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002/AnsiballZ_service_facts.py <<< 13830 1727204078.66226: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204078.67796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204078.68190: stderr chunk (state=3): >>><<< 13830 1727204078.68193: stdout chunk (state=3): >>><<< 13830 1727204078.68212: done transferring module to remote 13830 1727204078.68223: _low_level_execute_command(): starting 13830 1727204078.68228: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002/ /root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002/AnsiballZ_service_facts.py && sleep 0' 13830 1727204078.69151: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204078.69155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204078.69868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204078.69873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204078.69929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204078.69938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204078.69978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.70081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204078.70345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204078.70490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204078.72184: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 13830 1727204078.72188: stderr chunk (state=3): >>><<< 13830 1727204078.72190: stdout chunk (state=3): >>><<< 13830 1727204078.72209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204078.72213: _low_level_execute_command(): starting 13830 1727204078.72219: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002/AnsiballZ_service_facts.py && sleep 0' 13830 1727204078.73321: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204078.73336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204078.73350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204078.73370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204078.73414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204078.73426: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204078.73440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.73456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204078.73474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204078.73488: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204078.73500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204078.73512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204078.73526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204078.73537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204078.73547: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204078.73559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204078.74331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204078.74348: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204078.74362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204078.74444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204080.02162: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": 
"initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 13830 1727204080.02181: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stop<<< 13830 1727204080.02186: 
stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 13830 1727204080.02214: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed<<< 13830 1727204080.02220: stdout chunk (state=3): >>>.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13830 1727204080.03483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204080.03574: stderr chunk (state=3): >>><<< 13830 1727204080.03577: stdout chunk (state=3): >>><<< 13830 1727204080.03779: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": 
"dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", 
"status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204080.04417: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204080.04443: _low_level_execute_command(): starting 13830 1727204080.04453: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204078.30355-14869-102084178120002/ > /dev/null 2>&1 && sleep 0' 13830 1727204080.05172: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204080.05192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.05209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.05233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.05283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204080.05304: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204080.05321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.05346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204080.05359: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204080.05373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204080.05386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.05398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.05418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.05433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204080.05444: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204080.05463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.05546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204080.05573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204080.05589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204080.05661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204080.07537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204080.07645: stderr chunk (state=3): >>><<< 13830 1727204080.07657: stdout chunk (state=3): >>><<< 13830 1727204080.08011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204080.08015: handler run complete 13830 1727204080.08017: variable 'ansible_facts' from source: unknown 13830 1727204080.08089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204080.08878: variable 'ansible_facts' from source: unknown 13830 1727204080.09161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204080.09771: attempt loop complete, returning result 13830 1727204080.09855: _execute() done 13830 1727204080.09862: dumping result to json 13830 1727204080.09922: done dumping result, returning 13830 1727204080.09969: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1659-6b02-0000000002d9] 13830 1727204080.10038: sending task result for task 0affcd87-79f5-1659-6b02-0000000002d9 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204080.11335: no more pending results, returning what we have 13830 1727204080.11338: results queue empty 13830 1727204080.11339: checking for any_errors_fatal 13830 1727204080.11343: done checking for any_errors_fatal 13830 1727204080.11344: checking for max_fail_percentage 13830 1727204080.11345: done checking for max_fail_percentage 13830 1727204080.11346: checking to see if all hosts have failed and the running result is not ok 13830 1727204080.11347: done checking to see if all hosts have failed 13830 1727204080.11348: getting the remaining hosts for this loop 13830 1727204080.11349: done getting the remaining hosts for this loop 13830 1727204080.11354: getting the next task for host managed-node3 13830 1727204080.11361: done getting next task for host managed-node3 13830 1727204080.11367: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13830 1727204080.11373: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204080.11383: getting variables 13830 1727204080.11385: in VariableManager get_vars() 13830 1727204080.11418: Calling all_inventory to load vars for managed-node3 13830 1727204080.11421: Calling groups_inventory to load vars for managed-node3 13830 1727204080.11423: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204080.11436: Calling all_plugins_play to load vars for managed-node3 13830 1727204080.11439: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204080.11442: Calling groups_plugins_play to load vars for managed-node3 13830 1727204080.11795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204080.13173: done sending task result for task 0affcd87-79f5-1659-6b02-0000000002d9 13830 1727204080.13177: WORKER PROCESS EXITING 13830 1727204080.13375: done with get_vars() 13830 1727204080.13389: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:40 -0400 (0:00:01.881) 0:00:13.212 ***** 13830 1727204080.13488: entering _queue_task() for managed-node3/package_facts 13830 1727204080.13490: Creating lock for package_facts 13830 1727204080.13932: worker is 1 (out of 1 available) 13830 1727204080.13944: exiting _queue_task() for managed-node3/package_facts 13830 1727204080.13957: done queuing things up, now waiting for results queue to drain 13830 1727204080.13959: waiting for pending results... 
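The task queued here runs the package_facts module; the previous service_facts result was censored in the output above because 'no_log: true' was set on that task. The sketch below is a generic illustration of the package_facts pattern and of reading the resulting ansible_facts.packages dict (keyed by package name, each value a list of name/version/release/epoch/arch/source entries, as seen in the module output further down); it is not a reproduction of the role's set_facts.yml, and whether the role also applies no_log to this task is not shown at this point in the log.

---
# Illustrative sketch only: the package_facts pattern queued above.
- hosts: managed-node3
  gather_facts: false
  tasks:
    - name: Check which packages are installed (sketch, not the role's task file)
      ansible.builtin.package_facts:

    - name: Example read of the resulting dict (tzdata appears in the output below)
      ansible.builtin.debug:
        msg: "tzdata {{ ansible_facts.packages['tzdata'][0].version }} is installed"
      when: "'tzdata' in ansible_facts.packages"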
13830 1727204080.14237: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13830 1727204080.14394: in run() - task 0affcd87-79f5-1659-6b02-0000000002da 13830 1727204080.14421: variable 'ansible_search_path' from source: unknown 13830 1727204080.14432: variable 'ansible_search_path' from source: unknown 13830 1727204080.14473: calling self._execute() 13830 1727204080.14563: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204080.14577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204080.14590: variable 'omit' from source: magic vars 13830 1727204080.14989: variable 'ansible_distribution_major_version' from source: facts 13830 1727204080.15007: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204080.15016: variable 'omit' from source: magic vars 13830 1727204080.15103: variable 'omit' from source: magic vars 13830 1727204080.15143: variable 'omit' from source: magic vars 13830 1727204080.15194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204080.15238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204080.15267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204080.15290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204080.15305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204080.15341: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204080.15349: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204080.15356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204080.15459: Set connection var ansible_connection to ssh 13830 1727204080.15477: Set connection var ansible_timeout to 10 13830 1727204080.15490: Set connection var ansible_shell_executable to /bin/sh 13830 1727204080.15496: Set connection var ansible_shell_type to sh 13830 1727204080.15505: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204080.15517: Set connection var ansible_pipelining to False 13830 1727204080.15546: variable 'ansible_shell_executable' from source: unknown 13830 1727204080.15553: variable 'ansible_connection' from source: unknown 13830 1727204080.15560: variable 'ansible_module_compression' from source: unknown 13830 1727204080.15568: variable 'ansible_shell_type' from source: unknown 13830 1727204080.15574: variable 'ansible_shell_executable' from source: unknown 13830 1727204080.15580: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204080.15587: variable 'ansible_pipelining' from source: unknown 13830 1727204080.15598: variable 'ansible_timeout' from source: unknown 13830 1727204080.15606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204080.15816: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204080.15836: variable 'omit' from source: magic vars 13830 
1727204080.15846: starting attempt loop 13830 1727204080.15852: running the handler 13830 1727204080.15872: _low_level_execute_command(): starting 13830 1727204080.15884: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204080.16756: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204080.16773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.16792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.16811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.16859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204080.16910: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204080.16982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.17001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204080.17016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204080.17027: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204080.17042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.17055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.17076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.17089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204080.17100: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204080.17117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.17259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204080.17287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204080.17302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204080.17409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204080.18975: stdout chunk (state=3): >>>/root <<< 13830 1727204080.19074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204080.19163: stderr chunk (state=3): >>><<< 13830 1727204080.19169: stdout chunk (state=3): >>><<< 13830 1727204080.19290: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204080.19293: _low_level_execute_command(): starting 13830 1727204080.19296: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124 `" && echo ansible-tmp-1727204080.1919146-14941-7439512834124="` echo /root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124 `" ) && sleep 0' 13830 1727204080.20769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.20773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.20921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.20924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.20927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.20985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204080.21087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204080.21091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204080.21145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204080.22960: stdout chunk (state=3): >>>ansible-tmp-1727204080.1919146-14941-7439512834124=/root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124 <<< 13830 1727204080.23082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204080.23158: stderr chunk (state=3): >>><<< 13830 1727204080.23161: stdout chunk (state=3): >>><<< 13830 1727204080.23183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204080.1919146-14941-7439512834124=/root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204080.23231: variable 'ansible_module_compression' from source: unknown 13830 1727204080.23290: ANSIBALLZ: Using lock for package_facts 13830 1727204080.23294: ANSIBALLZ: Acquiring lock 13830 1727204080.23297: ANSIBALLZ: Lock acquired: 140043654790272 13830 1727204080.23299: ANSIBALLZ: Creating module 13830 1727204080.67027: ANSIBALLZ: Writing module into payload 13830 1727204080.67145: ANSIBALLZ: Writing module 13830 1727204080.67175: ANSIBALLZ: Renaming module 13830 1727204080.67179: ANSIBALLZ: Done creating module 13830 1727204080.67210: variable 'ansible_facts' from source: unknown 13830 1727204080.67350: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124/AnsiballZ_package_facts.py 13830 1727204080.67558: Sending initial data 13830 1727204080.67561: Sent initial data (160 bytes) 13830 1727204080.70936: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.70944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.70987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.70993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.71014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.71020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.71193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204080.71214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204080.71342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204080.73109: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204080.73142: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204080.73179: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpyaqo_4re /root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124/AnsiballZ_package_facts.py <<< 13830 1727204080.73214: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204080.76087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204080.76363: stderr chunk (state=3): >>><<< 13830 1727204080.76369: stdout chunk (state=3): >>><<< 13830 1727204080.76371: done transferring module to remote 13830 1727204080.76373: _low_level_execute_command(): starting 13830 1727204080.76376: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124/ /root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124/AnsiballZ_package_facts.py && sleep 0' 13830 1727204080.77493: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204080.77501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.77511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.77524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.77569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204080.77576: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204080.77587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.77599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204080.77606: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204080.77614: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204080.77622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.77631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.77645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.77652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204080.77658: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204080.77669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.77744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204080.77761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204080.77775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204080.77847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204080.79627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204080.79631: stdout chunk (state=3): >>><<< 13830 1727204080.79634: stderr chunk 
(state=3): >>><<< 13830 1727204080.79670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204080.79674: _low_level_execute_command(): starting 13830 1727204080.79677: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124/AnsiballZ_package_facts.py && sleep 0' 13830 1727204080.80963: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204080.80972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.80983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.80999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.81040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204080.81047: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204080.81056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.81073: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204080.81081: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204080.81087: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204080.81095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204080.81102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204080.81114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204080.81120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204080.81127: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204080.81139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204080.81223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204080.81226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204080.81234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 13830 1727204080.81315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204081.26966: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "ve<<< 13830 1727204081.27036: stdout chunk (state=3): >>>rsion": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils<<< 13830 1727204081.27045: stdout chunk (state=3): >>>", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": 
"systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "<<< 13830 1727204081.27056: stdout chunk (state=3): >>>release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "r<<< 13830 1727204081.27061: stdout chunk (state=3): >>>pm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86<<< 13830 1727204081.27069: stdout chunk (state=3): >>>_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "relea<<< 13830 1727204081.27072: stdout chunk (state=3): >>>se": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gob<<< 13830 1727204081.27079: stdout chunk (state=3): >>>ject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "no<<< 13830 1727204081.27094: stdout chunk (state=3): >>>arch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": 
"parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1<<< 13830 1727204081.27102: stdout chunk (state=3): >>>", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": 
"perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "ver<<< 13830 1727204081.27129: stdout chunk (state=3): >>>sion": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-l<<< 13830 1727204081.27145: stdout chunk (state=3): >>>ibs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": 
"2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 13830 1727204081.27162: stdout chunk (state=3): >>>gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch":<<< 13830 1727204081.27169: stdout chunk (state=3): >>> null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13830 1727204081.28682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204081.28743: stderr chunk (state=3): >>><<< 13830 1727204081.28750: stdout chunk (state=3): >>><<< 13830 1727204081.28789: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204081.30480: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204081.30499: _low_level_execute_command(): starting 13830 1727204081.30503: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204080.1919146-14941-7439512834124/ > /dev/null 2>&1 && sleep 0' 13830 1727204081.30981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204081.30991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204081.31026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204081.31043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204081.31088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204081.31100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204081.31157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204081.32926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204081.32979: stderr chunk (state=3): >>><<< 13830 1727204081.32983: stdout chunk (state=3): >>><<< 13830 1727204081.32996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204081.33002: handler run complete 13830 1727204081.33500: variable 'ansible_facts' from source: unknown 13830 1727204081.33772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.34990: variable 'ansible_facts' from source: unknown 13830 1727204081.35246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.35686: attempt loop complete, returning result 13830 1727204081.35696: _execute() done 13830 1727204081.35699: dumping result to json 13830 1727204081.35823: done dumping result, returning 13830 1727204081.35835: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1659-6b02-0000000002da] 13830 1727204081.35840: sending task result for task 0affcd87-79f5-1659-6b02-0000000002da 13830 1727204081.37144: done sending task result for task 0affcd87-79f5-1659-6b02-0000000002da 13830 1727204081.37147: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204081.37192: no more pending results, returning what we have 13830 1727204081.37195: results queue empty 13830 1727204081.37195: checking for any_errors_fatal 13830 1727204081.37199: done checking for any_errors_fatal 13830 1727204081.37199: checking for max_fail_percentage 13830 1727204081.37200: done checking for max_fail_percentage 13830 1727204081.37201: checking to see if all hosts have failed and the running result is not ok 13830 1727204081.37202: done checking to see if all hosts have failed 13830 1727204081.37202: getting the remaining hosts for this loop 13830 1727204081.37203: done getting the remaining hosts for this loop 13830 1727204081.37206: getting the next task for host managed-node3 13830 1727204081.37211: done getting next task for host managed-node3 13830 1727204081.37214: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13830 1727204081.37218: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13830 1727204081.37224: getting variables 13830 1727204081.37225: in VariableManager get_vars() 13830 1727204081.37248: Calling all_inventory to load vars for managed-node3 13830 1727204081.37250: Calling groups_inventory to load vars for managed-node3 13830 1727204081.37252: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204081.37258: Calling all_plugins_play to load vars for managed-node3 13830 1727204081.37260: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204081.37262: Calling groups_plugins_play to load vars for managed-node3 13830 1727204081.37972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.38990: done with get_vars() 13830 1727204081.39006: done getting variables 13830 1727204081.39053: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:41 -0400 (0:00:01.255) 0:00:14.468 ***** 13830 1727204081.39084: entering _queue_task() for managed-node3/debug 13830 1727204081.39303: worker is 1 (out of 1 available) 13830 1727204081.39316: exiting _queue_task() for managed-node3/debug 13830 1727204081.39328: done queuing things up, now waiting for results queue to drain 13830 1727204081.39332: waiting for pending results... 
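The task just queued (tasks/main.yml:7) uses the debug action, and its result further down prints "Using network provider: nm" from the network_provider variable that was set earlier via set_fact. A hedged sketch of a task of this shape, as an approximation rather than the role's verbatim definition:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"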
13830 1727204081.39508: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 13830 1727204081.39596: in run() - task 0affcd87-79f5-1659-6b02-000000000278 13830 1727204081.39609: variable 'ansible_search_path' from source: unknown 13830 1727204081.39613: variable 'ansible_search_path' from source: unknown 13830 1727204081.39642: calling self._execute() 13830 1727204081.39720: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204081.39735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204081.39747: variable 'omit' from source: magic vars 13830 1727204081.40099: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.40118: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204081.40128: variable 'omit' from source: magic vars 13830 1727204081.40190: variable 'omit' from source: magic vars 13830 1727204081.40292: variable 'network_provider' from source: set_fact 13830 1727204081.40313: variable 'omit' from source: magic vars 13830 1727204081.40356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204081.40398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204081.40426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204081.40447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204081.40462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204081.40496: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204081.40505: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204081.40514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204081.40611: Set connection var ansible_connection to ssh 13830 1727204081.40630: Set connection var ansible_timeout to 10 13830 1727204081.40641: Set connection var ansible_shell_executable to /bin/sh 13830 1727204081.40648: Set connection var ansible_shell_type to sh 13830 1727204081.40656: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204081.40672: Set connection var ansible_pipelining to False 13830 1727204081.40699: variable 'ansible_shell_executable' from source: unknown 13830 1727204081.40708: variable 'ansible_connection' from source: unknown 13830 1727204081.40715: variable 'ansible_module_compression' from source: unknown 13830 1727204081.40724: variable 'ansible_shell_type' from source: unknown 13830 1727204081.40733: variable 'ansible_shell_executable' from source: unknown 13830 1727204081.40741: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204081.40749: variable 'ansible_pipelining' from source: unknown 13830 1727204081.40756: variable 'ansible_timeout' from source: unknown 13830 1727204081.40767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204081.40913: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 13830 1727204081.40930: variable 'omit' from source: magic vars 13830 1727204081.40942: starting attempt loop 13830 1727204081.40952: running the handler 13830 1727204081.41002: handler run complete 13830 1727204081.41021: attempt loop complete, returning result 13830 1727204081.41028: _execute() done 13830 1727204081.41035: dumping result to json 13830 1727204081.41041: done dumping result, returning 13830 1727204081.41052: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1659-6b02-000000000278] 13830 1727204081.41065: sending task result for task 0affcd87-79f5-1659-6b02-000000000278 ok: [managed-node3] => {} MSG: Using network provider: nm 13830 1727204081.41244: no more pending results, returning what we have 13830 1727204081.41248: results queue empty 13830 1727204081.41249: checking for any_errors_fatal 13830 1727204081.41259: done checking for any_errors_fatal 13830 1727204081.41260: checking for max_fail_percentage 13830 1727204081.41261: done checking for max_fail_percentage 13830 1727204081.41262: checking to see if all hosts have failed and the running result is not ok 13830 1727204081.41263: done checking to see if all hosts have failed 13830 1727204081.41265: getting the remaining hosts for this loop 13830 1727204081.41267: done getting the remaining hosts for this loop 13830 1727204081.41271: getting the next task for host managed-node3 13830 1727204081.41278: done getting next task for host managed-node3 13830 1727204081.41282: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13830 1727204081.41287: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204081.41297: getting variables 13830 1727204081.41298: in VariableManager get_vars() 13830 1727204081.41332: Calling all_inventory to load vars for managed-node3 13830 1727204081.41334: Calling groups_inventory to load vars for managed-node3 13830 1727204081.41337: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204081.41345: Calling all_plugins_play to load vars for managed-node3 13830 1727204081.41350: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204081.41353: Calling groups_plugins_play to load vars for managed-node3 13830 1727204081.42028: done sending task result for task 0affcd87-79f5-1659-6b02-000000000278 13830 1727204081.42034: WORKER PROCESS EXITING 13830 1727204081.42199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.43231: done with get_vars() 13830 1727204081.43256: done getting variables 13830 1727204081.43350: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.043) 0:00:14.512 ***** 13830 1727204081.43391: entering _queue_task() for managed-node3/fail 13830 1727204081.43393: Creating lock for fail 13830 1727204081.43704: worker is 1 (out of 1 available) 13830 1727204081.43718: exiting _queue_task() for managed-node3/fail 13830 1727204081.43731: done queuing things up, now waiting for results queue to drain 13830 1727204081.43733: waiting for pending results... 
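The guard task queued here (tasks/main.yml:11) runs the fail action, and its skip result below reports the false condition "network_state != {}". A hedged sketch of a guard of that shape (the message text is illustrative, and the real task may combine additional conditions, such as a check on the selected provider, that this log does not show):

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: The initscripts provider cannot apply network_state  # illustrative text only
      when: network_state != {}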
13830 1727204081.44022: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13830 1727204081.44156: in run() - task 0affcd87-79f5-1659-6b02-000000000279 13830 1727204081.44179: variable 'ansible_search_path' from source: unknown 13830 1727204081.44186: variable 'ansible_search_path' from source: unknown 13830 1727204081.44224: calling self._execute() 13830 1727204081.44314: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204081.44324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204081.44335: variable 'omit' from source: magic vars 13830 1727204081.44701: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.44724: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204081.44850: variable 'network_state' from source: role '' defaults 13830 1727204081.44866: Evaluated conditional (network_state != {}): False 13830 1727204081.44874: when evaluation is False, skipping this task 13830 1727204081.44880: _execute() done 13830 1727204081.44887: dumping result to json 13830 1727204081.44893: done dumping result, returning 13830 1727204081.44902: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1659-6b02-000000000279] 13830 1727204081.44913: sending task result for task 0affcd87-79f5-1659-6b02-000000000279 13830 1727204081.45026: done sending task result for task 0affcd87-79f5-1659-6b02-000000000279 13830 1727204081.45035: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204081.45087: no more pending results, returning what we have 13830 1727204081.45092: results queue empty 13830 1727204081.45093: checking for any_errors_fatal 13830 1727204081.45100: done checking for any_errors_fatal 13830 1727204081.45100: checking for max_fail_percentage 13830 1727204081.45102: done checking for max_fail_percentage 13830 1727204081.45103: checking to see if all hosts have failed and the running result is not ok 13830 1727204081.45104: done checking to see if all hosts have failed 13830 1727204081.45104: getting the remaining hosts for this loop 13830 1727204081.45106: done getting the remaining hosts for this loop 13830 1727204081.45111: getting the next task for host managed-node3 13830 1727204081.45119: done getting next task for host managed-node3 13830 1727204081.45123: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13830 1727204081.45129: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204081.45145: getting variables 13830 1727204081.45147: in VariableManager get_vars() 13830 1727204081.45185: Calling all_inventory to load vars for managed-node3 13830 1727204081.45188: Calling groups_inventory to load vars for managed-node3 13830 1727204081.45191: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204081.45202: Calling all_plugins_play to load vars for managed-node3 13830 1727204081.45205: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204081.45208: Calling groups_plugins_play to load vars for managed-node3 13830 1727204081.47082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.48968: done with get_vars() 13830 1727204081.48999: done getting variables 13830 1727204081.49060: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.057) 0:00:14.569 ***** 13830 1727204081.49100: entering _queue_task() for managed-node3/fail 13830 1727204081.49407: worker is 1 (out of 1 available) 13830 1727204081.49419: exiting _queue_task() for managed-node3/fail 13830 1727204081.49431: done queuing things up, now waiting for results queue to drain 13830 1727204081.49433: waiting for pending results... 
13830 1727204081.49778: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13830 1727204081.49957: in run() - task 0affcd87-79f5-1659-6b02-00000000027a 13830 1727204081.49979: variable 'ansible_search_path' from source: unknown 13830 1727204081.49988: variable 'ansible_search_path' from source: unknown 13830 1727204081.50085: calling self._execute() 13830 1727204081.50194: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204081.50206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204081.50230: variable 'omit' from source: magic vars 13830 1727204081.50611: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.50630: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204081.50756: variable 'network_state' from source: role '' defaults 13830 1727204081.50776: Evaluated conditional (network_state != {}): False 13830 1727204081.50786: when evaluation is False, skipping this task 13830 1727204081.50793: _execute() done 13830 1727204081.50799: dumping result to json 13830 1727204081.50806: done dumping result, returning 13830 1727204081.50818: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1659-6b02-00000000027a] 13830 1727204081.50829: sending task result for task 0affcd87-79f5-1659-6b02-00000000027a skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204081.50981: no more pending results, returning what we have 13830 1727204081.50986: results queue empty 13830 1727204081.50987: checking for any_errors_fatal 13830 1727204081.50994: done checking for any_errors_fatal 13830 1727204081.50995: checking for max_fail_percentage 13830 1727204081.50997: done checking for max_fail_percentage 13830 1727204081.50998: checking to see if all hosts have failed and the running result is not ok 13830 1727204081.50999: done checking to see if all hosts have failed 13830 1727204081.51000: getting the remaining hosts for this loop 13830 1727204081.51002: done getting the remaining hosts for this loop 13830 1727204081.51007: getting the next task for host managed-node3 13830 1727204081.51015: done getting next task for host managed-node3 13830 1727204081.51020: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13830 1727204081.51027: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204081.51042: getting variables 13830 1727204081.51045: in VariableManager get_vars() 13830 1727204081.51085: Calling all_inventory to load vars for managed-node3 13830 1727204081.51088: Calling groups_inventory to load vars for managed-node3 13830 1727204081.51091: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204081.51102: Calling all_plugins_play to load vars for managed-node3 13830 1727204081.51105: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204081.51108: Calling groups_plugins_play to load vars for managed-node3 13830 1727204081.52775: done sending task result for task 0affcd87-79f5-1659-6b02-00000000027a 13830 1727204081.52780: WORKER PROCESS EXITING 13830 1727204081.53344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.55027: done with get_vars() 13830 1727204081.55057: done getting variables 13830 1727204081.55119: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.060) 0:00:14.629 ***** 13830 1727204081.55155: entering _queue_task() for managed-node3/fail 13830 1727204081.55448: worker is 1 (out of 1 available) 13830 1727204081.55460: exiting _queue_task() for managed-node3/fail 13830 1727204081.55474: done queuing things up, now waiting for results queue to drain 13830 1727204081.55475: waiting for pending results... 
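This guard (tasks/main.yml:25) is also a fail action, but it is gated on the distribution version: the skip result below reports the false condition "ansible_distribution_major_version | int > 9", so it only fires on EL10 or later. A hedged sketch using that condition (message text illustrative; the real task presumably also checks whether a team interface is actually requested):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on this release  # illustrative text only
      when: ansible_distribution_major_version | int > 9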
13830 1727204081.55744: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13830 1727204081.55894: in run() - task 0affcd87-79f5-1659-6b02-00000000027b 13830 1727204081.55918: variable 'ansible_search_path' from source: unknown 13830 1727204081.55926: variable 'ansible_search_path' from source: unknown 13830 1727204081.55967: calling self._execute() 13830 1727204081.56058: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204081.56071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204081.56085: variable 'omit' from source: magic vars 13830 1727204081.56441: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.56463: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204081.56639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204081.59339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204081.59415: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204081.59457: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204081.59624: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204081.59656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204081.59853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204081.59890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204081.59921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204081.59973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204081.60068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204081.60280: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.60300: Evaluated conditional (ansible_distribution_major_version | int > 9): False 13830 1727204081.60334: when evaluation is False, skipping this task 13830 1727204081.60341: _execute() done 13830 1727204081.60349: dumping result to json 13830 1727204081.60357: done dumping result, returning 13830 1727204081.60379: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1659-6b02-00000000027b] 13830 1727204081.60413: sending task result for task 
0affcd87-79f5-1659-6b02-00000000027b skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 13830 1727204081.60588: no more pending results, returning what we have 13830 1727204081.60592: results queue empty 13830 1727204081.60593: checking for any_errors_fatal 13830 1727204081.60599: done checking for any_errors_fatal 13830 1727204081.60599: checking for max_fail_percentage 13830 1727204081.60602: done checking for max_fail_percentage 13830 1727204081.60603: checking to see if all hosts have failed and the running result is not ok 13830 1727204081.60604: done checking to see if all hosts have failed 13830 1727204081.60604: getting the remaining hosts for this loop 13830 1727204081.60606: done getting the remaining hosts for this loop 13830 1727204081.60610: getting the next task for host managed-node3 13830 1727204081.60619: done getting next task for host managed-node3 13830 1727204081.60623: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13830 1727204081.60629: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204081.60645: getting variables 13830 1727204081.60647: in VariableManager get_vars() 13830 1727204081.60686: Calling all_inventory to load vars for managed-node3 13830 1727204081.60689: Calling groups_inventory to load vars for managed-node3 13830 1727204081.60692: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204081.60702: Calling all_plugins_play to load vars for managed-node3 13830 1727204081.60705: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204081.60708: Calling groups_plugins_play to load vars for managed-node3 13830 1727204081.62327: done sending task result for task 0affcd87-79f5-1659-6b02-00000000027b 13830 1727204081.62331: WORKER PROCESS EXITING 13830 1727204081.63359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.66752: done with get_vars() 13830 1727204081.66785: done getting variables 13830 1727204081.66886: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.117) 0:00:14.747 ***** 13830 1727204081.66919: entering _queue_task() for managed-node3/dnf 13830 1727204081.67310: worker is 1 (out of 1 available) 13830 1727204081.67322: exiting _queue_task() for managed-node3/dnf 13830 1727204081.67334: done queuing things up, now waiting for results queue to drain 13830 1727204081.67336: waiting for pending results... 
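For this task (tasks/main.yml:36) the dnf action plugin is loaded, so the role delegates the update check to the dnf module. The role's exact task is not visible in this log; as a loosely analogous, purely illustrative sketch, an "are updates available?" check run in check mode against a hypothetical package list variable could look like:

    - name: Check whether updates for network packages are available
      ansible.builtin.dnf:
        name: "{{ network_packages }}"    # hypothetical variable name
        state: latest
      check_mode: true                    # reports changed=true if updates would be installed
      register: __network_updates_check   # hypothetical register name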
13830 1727204081.68387: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13830 1727204081.68705: in run() - task 0affcd87-79f5-1659-6b02-00000000027c 13830 1727204081.68728: variable 'ansible_search_path' from source: unknown 13830 1727204081.68736: variable 'ansible_search_path' from source: unknown 13830 1727204081.68783: calling self._execute() 13830 1727204081.68875: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204081.68888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204081.68903: variable 'omit' from source: magic vars 13830 1727204081.69405: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.69424: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204081.69614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204081.74197: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204081.74280: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204081.74432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204081.74468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204081.74494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204081.74783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204081.74811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204081.74840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204081.74996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204081.75009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204081.75267: variable 'ansible_distribution' from source: facts 13830 1727204081.75271: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.75374: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13830 1727204081.75699: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204081.75955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204081.75980: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204081.76005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204081.76161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204081.76177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204081.76216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204081.76241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204081.76382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204081.76421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204081.76438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204081.76593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204081.76617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204081.76645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204081.76682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204081.76812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204081.77087: variable 'network_connections' from source: include params 13830 1727204081.77099: variable 'controller_profile' from source: play vars 13830 1727204081.77290: variable 'controller_profile' from source: play vars 13830 1727204081.77299: variable 'controller_device' from source: play vars 13830 1727204081.77481: variable 'controller_device' from source: play vars 13830 1727204081.77495: variable 'port1_profile' from 
source: play vars 13830 1727204081.77558: variable 'port1_profile' from source: play vars 13830 1727204081.77681: variable 'dhcp_interface1' from source: play vars 13830 1727204081.77743: variable 'dhcp_interface1' from source: play vars 13830 1727204081.77746: variable 'controller_profile' from source: play vars 13830 1727204081.78705: variable 'controller_profile' from source: play vars 13830 1727204081.78711: variable 'port2_profile' from source: play vars 13830 1727204081.78777: variable 'port2_profile' from source: play vars 13830 1727204081.78784: variable 'dhcp_interface2' from source: play vars 13830 1727204081.78847: variable 'dhcp_interface2' from source: play vars 13830 1727204081.78854: variable 'controller_profile' from source: play vars 13830 1727204081.78914: variable 'controller_profile' from source: play vars 13830 1727204081.78991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204081.80376: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204081.80424: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204081.80459: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204081.80489: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204081.80540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204081.80585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204081.80610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204081.80636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204081.80699: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204081.80952: variable 'network_connections' from source: include params 13830 1727204081.80957: variable 'controller_profile' from source: play vars 13830 1727204081.81020: variable 'controller_profile' from source: play vars 13830 1727204081.81026: variable 'controller_device' from source: play vars 13830 1727204081.81089: variable 'controller_device' from source: play vars 13830 1727204081.81100: variable 'port1_profile' from source: play vars 13830 1727204081.81161: variable 'port1_profile' from source: play vars 13830 1727204081.81168: variable 'dhcp_interface1' from source: play vars 13830 1727204081.81227: variable 'dhcp_interface1' from source: play vars 13830 1727204081.81235: variable 'controller_profile' from source: play vars 13830 1727204081.81998: variable 'controller_profile' from source: play vars 13830 1727204081.82004: variable 'port2_profile' from source: play vars 13830 1727204081.82071: variable 'port2_profile' from source: play vars 13830 1727204081.82074: variable 'dhcp_interface2' from source: play vars 13830 1727204081.82135: variable 'dhcp_interface2' from source: play 
vars 13830 1727204081.82142: variable 'controller_profile' from source: play vars 13830 1727204081.82203: variable 'controller_profile' from source: play vars 13830 1727204081.82240: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204081.82244: when evaluation is False, skipping this task 13830 1727204081.82246: _execute() done 13830 1727204081.82249: dumping result to json 13830 1727204081.82251: done dumping result, returning 13830 1727204081.82261: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-00000000027c] 13830 1727204081.82267: sending task result for task 0affcd87-79f5-1659-6b02-00000000027c 13830 1727204081.82361: done sending task result for task 0affcd87-79f5-1659-6b02-00000000027c 13830 1727204081.82366: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204081.82438: no more pending results, returning what we have 13830 1727204081.82442: results queue empty 13830 1727204081.82442: checking for any_errors_fatal 13830 1727204081.82449: done checking for any_errors_fatal 13830 1727204081.82450: checking for max_fail_percentage 13830 1727204081.82452: done checking for max_fail_percentage 13830 1727204081.82453: checking to see if all hosts have failed and the running result is not ok 13830 1727204081.82453: done checking to see if all hosts have failed 13830 1727204081.82454: getting the remaining hosts for this loop 13830 1727204081.82456: done getting the remaining hosts for this loop 13830 1727204081.82460: getting the next task for host managed-node3 13830 1727204081.82469: done getting next task for host managed-node3 13830 1727204081.82473: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13830 1727204081.82478: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204081.82491: getting variables 13830 1727204081.82493: in VariableManager get_vars() 13830 1727204081.82525: Calling all_inventory to load vars for managed-node3 13830 1727204081.82527: Calling groups_inventory to load vars for managed-node3 13830 1727204081.82529: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204081.82538: Calling all_plugins_play to load vars for managed-node3 13830 1727204081.82540: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204081.82543: Calling groups_plugins_play to load vars for managed-node3 13830 1727204081.84892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.87361: done with get_vars() 13830 1727204081.87384: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13830 1727204081.87461: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.205) 0:00:14.953 ***** 13830 1727204081.87497: entering _queue_task() for managed-node3/yum 13830 1727204081.87499: Creating lock for yum 13830 1727204081.87832: worker is 1 (out of 1 available) 13830 1727204081.87847: exiting _queue_task() for managed-node3/yum 13830 1727204081.87861: done queuing things up, now waiting for results queue to drain 13830 1727204081.87862: waiting for pending results... 
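The YUM variant queued here (main.yml:48) is the counterpart of the DNF check for older distributions; the log above notes that ansible.builtin.yum is redirected to ansible.builtin.dnf by this ansible-core. A sketch consistent with the skip recorded below, where ansible_distribution_major_version | int < 8 is the first condition to come out False, under the same assumptions about the package list as in the DNF sketch earlier:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:                   # redirected to ansible.builtin.dnf, as logged above
    name: "{{ network_packages }}"       # assumed variable
    state: latest
  check_mode: true                       # assumed, as in the DNF sketch
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined   # assumed by symmetry with the DNF task; never evaluated here

Because managed-node3 reports a major version of 8 or newer, the first condition is False and the task is skipped without the remaining conditions being evaluated.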
13830 1727204081.88990: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13830 1727204081.89148: in run() - task 0affcd87-79f5-1659-6b02-00000000027d 13830 1727204081.89307: variable 'ansible_search_path' from source: unknown 13830 1727204081.89316: variable 'ansible_search_path' from source: unknown 13830 1727204081.89357: calling self._execute() 13830 1727204081.89562: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204081.89576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204081.89590: variable 'omit' from source: magic vars 13830 1727204081.90384: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.90401: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204081.90694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204081.94646: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204081.94739: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204081.94785: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204081.94827: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204081.94862: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204081.94958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204081.94993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204081.95025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204081.95082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204081.95102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204081.95212: variable 'ansible_distribution_major_version' from source: facts 13830 1727204081.95235: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13830 1727204081.95244: when evaluation is False, skipping this task 13830 1727204081.95250: _execute() done 13830 1727204081.95262: dumping result to json 13830 1727204081.95276: done dumping result, returning 13830 1727204081.95288: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-00000000027d] 13830 
1727204081.95299: sending task result for task 0affcd87-79f5-1659-6b02-00000000027d skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13830 1727204081.95462: no more pending results, returning what we have 13830 1727204081.95468: results queue empty 13830 1727204081.95469: checking for any_errors_fatal 13830 1727204081.95477: done checking for any_errors_fatal 13830 1727204081.95478: checking for max_fail_percentage 13830 1727204081.95480: done checking for max_fail_percentage 13830 1727204081.95481: checking to see if all hosts have failed and the running result is not ok 13830 1727204081.95481: done checking to see if all hosts have failed 13830 1727204081.95482: getting the remaining hosts for this loop 13830 1727204081.95484: done getting the remaining hosts for this loop 13830 1727204081.95489: getting the next task for host managed-node3 13830 1727204081.95498: done getting next task for host managed-node3 13830 1727204081.95502: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13830 1727204081.95507: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204081.95522: getting variables 13830 1727204081.95524: in VariableManager get_vars() 13830 1727204081.95566: Calling all_inventory to load vars for managed-node3 13830 1727204081.95570: Calling groups_inventory to load vars for managed-node3 13830 1727204081.95573: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204081.95584: Calling all_plugins_play to load vars for managed-node3 13830 1727204081.95586: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204081.95589: Calling groups_plugins_play to load vars for managed-node3 13830 1727204081.96615: done sending task result for task 0affcd87-79f5-1659-6b02-00000000027d 13830 1727204081.96619: WORKER PROCESS EXITING 13830 1727204081.97980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204081.99957: done with get_vars() 13830 1727204081.99989: done getting variables 13830 1727204082.00058: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.125) 0:00:15.079 ***** 13830 1727204082.00095: entering _queue_task() for managed-node3/fail 13830 1727204082.00420: worker is 1 (out of 1 available) 13830 1727204082.00438: exiting _queue_task() for managed-node3/fail 13830 1727204082.00452: done queuing things up, now waiting for results queue to drain 13830 1727204082.00453: waiting for pending results... 
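The task queued here (main.yml:60) loads the fail action, so its purpose is to stop the run rather than restart NetworkManager without consent. Judging only from that and from the condition whose evaluation follows (and comes out False), it plausibly looks roughly like the sketch below; the message wording is purely illustrative, and the real task may carry additional guard conditions (for example an explicit consent variable) that never appear in this log because the logged condition is already False:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-                              # illustrative wording, not the role's actual message
      Wireless or team interfaces require NetworkManager to be restarted;
      confirm that a restart is acceptable and re-run the role.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined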
13830 1727204082.00741: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13830 1727204082.00906: in run() - task 0affcd87-79f5-1659-6b02-00000000027e 13830 1727204082.00926: variable 'ansible_search_path' from source: unknown 13830 1727204082.00936: variable 'ansible_search_path' from source: unknown 13830 1727204082.00978: calling self._execute() 13830 1727204082.01070: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204082.01081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204082.01099: variable 'omit' from source: magic vars 13830 1727204082.01561: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.01582: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204082.01714: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204082.01897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204082.05459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204082.05513: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204082.05543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204082.05570: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204082.05591: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204082.05652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.05674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.05691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.05728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.05755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.05789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.05805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.05873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.05915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.05934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.05985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.06014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.06044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.06092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.06111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.06288: variable 'network_connections' from source: include params 13830 1727204082.06307: variable 'controller_profile' from source: play vars 13830 1727204082.06383: variable 'controller_profile' from source: play vars 13830 1727204082.06394: variable 'controller_device' from source: play vars 13830 1727204082.06459: variable 'controller_device' from source: play vars 13830 1727204082.06475: variable 'port1_profile' from source: play vars 13830 1727204082.06537: variable 'port1_profile' from source: play vars 13830 1727204082.06544: variable 'dhcp_interface1' from source: play vars 13830 1727204082.06605: variable 'dhcp_interface1' from source: play vars 13830 1727204082.06611: variable 'controller_profile' from source: play vars 13830 1727204082.06694: variable 'controller_profile' from source: play vars 13830 1727204082.06703: variable 'port2_profile' from source: play vars 13830 1727204082.06787: variable 'port2_profile' from source: play vars 13830 1727204082.06794: variable 'dhcp_interface2' from source: play vars 13830 1727204082.06988: variable 'dhcp_interface2' from source: play vars 13830 1727204082.06995: variable 'controller_profile' from source: play vars 13830 1727204082.07053: variable 'controller_profile' from source: play vars 13830 1727204082.07128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204082.07311: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204082.07349: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204082.07384: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204082.07412: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204082.07480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204082.07502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204082.07535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.07560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204082.07624: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204082.07901: variable 'network_connections' from source: include params 13830 1727204082.07905: variable 'controller_profile' from source: play vars 13830 1727204082.07969: variable 'controller_profile' from source: play vars 13830 1727204082.07976: variable 'controller_device' from source: play vars 13830 1727204082.08035: variable 'controller_device' from source: play vars 13830 1727204082.08044: variable 'port1_profile' from source: play vars 13830 1727204082.08107: variable 'port1_profile' from source: play vars 13830 1727204082.08113: variable 'dhcp_interface1' from source: play vars 13830 1727204082.08172: variable 'dhcp_interface1' from source: play vars 13830 1727204082.08182: variable 'controller_profile' from source: play vars 13830 1727204082.08240: variable 'controller_profile' from source: play vars 13830 1727204082.08248: variable 'port2_profile' from source: play vars 13830 1727204082.08311: variable 'port2_profile' from source: play vars 13830 1727204082.08318: variable 'dhcp_interface2' from source: play vars 13830 1727204082.08376: variable 'dhcp_interface2' from source: play vars 13830 1727204082.08382: variable 'controller_profile' from source: play vars 13830 1727204082.08445: variable 'controller_profile' from source: play vars 13830 1727204082.08479: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204082.08482: when evaluation is False, skipping this task 13830 1727204082.08485: _execute() done 13830 1727204082.08487: dumping result to json 13830 1727204082.08489: done dumping result, returning 13830 1727204082.08498: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-00000000027e] 13830 1727204082.08507: sending task result for task 0affcd87-79f5-1659-6b02-00000000027e 13830 1727204082.08605: done sending task result for task 0affcd87-79f5-1659-6b02-00000000027e 13830 1727204082.08607: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204082.08661: no more pending results, returning what we have 13830 1727204082.08667: results queue empty 13830 1727204082.08668: checking for any_errors_fatal 13830 1727204082.08673: done checking for any_errors_fatal 13830 
1727204082.08674: checking for max_fail_percentage 13830 1727204082.08676: done checking for max_fail_percentage 13830 1727204082.08677: checking to see if all hosts have failed and the running result is not ok 13830 1727204082.08677: done checking to see if all hosts have failed 13830 1727204082.08678: getting the remaining hosts for this loop 13830 1727204082.08679: done getting the remaining hosts for this loop 13830 1727204082.08683: getting the next task for host managed-node3 13830 1727204082.08690: done getting next task for host managed-node3 13830 1727204082.08694: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13830 1727204082.08699: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204082.08712: getting variables 13830 1727204082.08714: in VariableManager get_vars() 13830 1727204082.08753: Calling all_inventory to load vars for managed-node3 13830 1727204082.08756: Calling groups_inventory to load vars for managed-node3 13830 1727204082.08758: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204082.08769: Calling all_plugins_play to load vars for managed-node3 13830 1727204082.08771: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204082.08774: Calling groups_plugins_play to load vars for managed-node3 13830 1727204082.10688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204082.12563: done with get_vars() 13830 1727204082.12596: done getting variables 13830 1727204082.12665: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.126) 0:00:15.205 ***** 13830 1727204082.12708: entering _queue_task() for managed-node3/package 13830 1727204082.13452: worker is 1 (out of 1 available) 13830 1727204082.13468: exiting _queue_task() for managed-node3/package 13830 1727204082.13482: done queuing things up, now waiting for results queue to drain 13830 1727204082.13484: waiting for pending results... 13830 1727204082.14223: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 13830 1727204082.14528: in run() - task 0affcd87-79f5-1659-6b02-00000000027f 13830 1727204082.14574: variable 'ansible_search_path' from source: unknown 13830 1727204082.14583: variable 'ansible_search_path' from source: unknown 13830 1727204082.14628: calling self._execute() 13830 1727204082.14733: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204082.14746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204082.14769: variable 'omit' from source: magic vars 13830 1727204082.15166: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.15191: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204082.15411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204082.15707: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204082.15762: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204082.15807: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204082.15852: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204082.15978: variable 'network_packages' from source: role '' defaults 13830 1727204082.16096: variable '__network_provider_setup' from source: role '' defaults 13830 1727204082.16111: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204082.16272: variable 
'__network_service_name_default_nm' from source: role '' defaults 13830 1727204082.16286: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204082.16975: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204082.17484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204082.20081: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204082.20208: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204082.20255: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204082.20326: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204082.20362: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204082.20455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.20491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.20534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.20584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.20606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.20668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.20697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.20726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.20783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.20802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.21053: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13830 1727204082.21186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.21218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.21252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.21304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.21322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.21425: variable 'ansible_python' from source: facts 13830 1727204082.21449: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13830 1727204082.21544: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204082.21638: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204082.21780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.21807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.21847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.21928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.21956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.22004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.22047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.22082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.22124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.22148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.22322: variable 'network_connections' from source: include params 13830 1727204082.22337: variable 'controller_profile' from source: play vars 13830 1727204082.22461: variable 'controller_profile' from source: play vars 13830 1727204082.22506: variable 'controller_device' from source: play vars 13830 1727204082.22633: variable 'controller_device' from source: play vars 13830 1727204082.22655: variable 'port1_profile' from source: play vars 13830 1727204082.22768: variable 'port1_profile' from source: play vars 13830 1727204082.22783: variable 'dhcp_interface1' from source: play vars 13830 1727204082.22907: variable 'dhcp_interface1' from source: play vars 13830 1727204082.22926: variable 'controller_profile' from source: play vars 13830 1727204082.23039: variable 'controller_profile' from source: play vars 13830 1727204082.23053: variable 'port2_profile' from source: play vars 13830 1727204082.23170: variable 'port2_profile' from source: play vars 13830 1727204082.23185: variable 'dhcp_interface2' from source: play vars 13830 1727204082.23295: variable 'dhcp_interface2' from source: play vars 13830 1727204082.23308: variable 'controller_profile' from source: play vars 13830 1727204082.23419: variable 'controller_profile' from source: play vars 13830 1727204082.23510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204082.23547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204082.23595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.23635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204082.23698: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204082.24022: variable 'network_connections' from source: include params 13830 1727204082.24039: variable 'controller_profile' from source: play vars 13830 1727204082.24154: variable 'controller_profile' from source: play vars 13830 1727204082.24170: variable 'controller_device' from source: play vars 13830 1727204082.24286: variable 'controller_device' from source: play vars 13830 1727204082.24311: variable 'port1_profile' from source: play vars 13830 1727204082.24438: variable 'port1_profile' from source: play vars 13830 1727204082.24457: variable 'dhcp_interface1' from source: play vars 13830 1727204082.24575: variable 'dhcp_interface1' from source: play vars 13830 1727204082.24587: variable 'controller_profile' from source: play vars 13830 1727204082.24698: variable 'controller_profile' from source: play vars 13830 1727204082.24712: variable 'port2_profile' from source: play vars 13830 1727204082.24825: variable 'port2_profile' from source: play vars 13830 1727204082.24841: variable 'dhcp_interface2' from source: play vars 13830 1727204082.24986: variable 'dhcp_interface2' from source: play vars 13830 1727204082.25011: variable 'controller_profile' from 
source: play vars 13830 1727204082.25133: variable 'controller_profile' from source: play vars 13830 1727204082.25198: variable '__network_packages_default_wireless' from source: role '' defaults 13830 1727204082.25356: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204082.25927: variable 'network_connections' from source: include params 13830 1727204082.25941: variable 'controller_profile' from source: play vars 13830 1727204082.26052: variable 'controller_profile' from source: play vars 13830 1727204082.26063: variable 'controller_device' from source: play vars 13830 1727204082.26139: variable 'controller_device' from source: play vars 13830 1727204082.26157: variable 'port1_profile' from source: play vars 13830 1727204082.26255: variable 'port1_profile' from source: play vars 13830 1727204082.26271: variable 'dhcp_interface1' from source: play vars 13830 1727204082.26411: variable 'dhcp_interface1' from source: play vars 13830 1727204082.26421: variable 'controller_profile' from source: play vars 13830 1727204082.26514: variable 'controller_profile' from source: play vars 13830 1727204082.26532: variable 'port2_profile' from source: play vars 13830 1727204082.26601: variable 'port2_profile' from source: play vars 13830 1727204082.26607: variable 'dhcp_interface2' from source: play vars 13830 1727204082.26653: variable 'dhcp_interface2' from source: play vars 13830 1727204082.26659: variable 'controller_profile' from source: play vars 13830 1727204082.26709: variable 'controller_profile' from source: play vars 13830 1727204082.26739: variable '__network_packages_default_team' from source: role '' defaults 13830 1727204082.26796: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204082.27003: variable 'network_connections' from source: include params 13830 1727204082.27009: variable 'controller_profile' from source: play vars 13830 1727204082.27056: variable 'controller_profile' from source: play vars 13830 1727204082.27062: variable 'controller_device' from source: play vars 13830 1727204082.27108: variable 'controller_device' from source: play vars 13830 1727204082.27119: variable 'port1_profile' from source: play vars 13830 1727204082.27167: variable 'port1_profile' from source: play vars 13830 1727204082.27173: variable 'dhcp_interface1' from source: play vars 13830 1727204082.27218: variable 'dhcp_interface1' from source: play vars 13830 1727204082.27223: variable 'controller_profile' from source: play vars 13830 1727204082.27272: variable 'controller_profile' from source: play vars 13830 1727204082.27278: variable 'port2_profile' from source: play vars 13830 1727204082.27322: variable 'port2_profile' from source: play vars 13830 1727204082.27328: variable 'dhcp_interface2' from source: play vars 13830 1727204082.27377: variable 'dhcp_interface2' from source: play vars 13830 1727204082.27382: variable 'controller_profile' from source: play vars 13830 1727204082.27427: variable 'controller_profile' from source: play vars 13830 1727204082.27495: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204082.27538: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204082.27544: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204082.27590: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204082.27724: variable 
'__network_packages_default_initscripts_bridge' from source: role '' defaults 13830 1727204082.28044: variable 'network_connections' from source: include params 13830 1727204082.28048: variable 'controller_profile' from source: play vars 13830 1727204082.28092: variable 'controller_profile' from source: play vars 13830 1727204082.28101: variable 'controller_device' from source: play vars 13830 1727204082.28143: variable 'controller_device' from source: play vars 13830 1727204082.28153: variable 'port1_profile' from source: play vars 13830 1727204082.28196: variable 'port1_profile' from source: play vars 13830 1727204082.28201: variable 'dhcp_interface1' from source: play vars 13830 1727204082.28246: variable 'dhcp_interface1' from source: play vars 13830 1727204082.28252: variable 'controller_profile' from source: play vars 13830 1727204082.28298: variable 'controller_profile' from source: play vars 13830 1727204082.28301: variable 'port2_profile' from source: play vars 13830 1727204082.28345: variable 'port2_profile' from source: play vars 13830 1727204082.28351: variable 'dhcp_interface2' from source: play vars 13830 1727204082.28393: variable 'dhcp_interface2' from source: play vars 13830 1727204082.28398: variable 'controller_profile' from source: play vars 13830 1727204082.28447: variable 'controller_profile' from source: play vars 13830 1727204082.28481: variable 'ansible_distribution' from source: facts 13830 1727204082.28502: variable '__network_rh_distros' from source: role '' defaults 13830 1727204082.28506: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.28551: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13830 1727204082.29450: variable 'ansible_distribution' from source: facts 13830 1727204082.29453: variable '__network_rh_distros' from source: role '' defaults 13830 1727204082.29461: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.29465: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13830 1727204082.29468: variable 'ansible_distribution' from source: facts 13830 1727204082.29470: variable '__network_rh_distros' from source: role '' defaults 13830 1727204082.29472: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.29474: variable 'network_provider' from source: set_fact 13830 1727204082.29476: variable 'ansible_facts' from source: unknown 13830 1727204082.29841: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13830 1727204082.29850: when evaluation is False, skipping this task 13830 1727204082.29856: _execute() done 13830 1727204082.29863: dumping result to json 13830 1727204082.29873: done dumping result, returning 13830 1727204082.29884: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1659-6b02-00000000027f] 13830 1727204082.29895: sending task result for task 0affcd87-79f5-1659-6b02-00000000027f skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13830 1727204082.30060: no more pending results, returning what we have 13830 1727204082.30067: results queue empty 13830 1727204082.30068: checking for any_errors_fatal 13830 1727204082.30075: done checking for any_errors_fatal 13830 1727204082.30075: checking for max_fail_percentage 13830 
1727204082.30077: done checking for max_fail_percentage 13830 1727204082.30078: checking to see if all hosts have failed and the running result is not ok 13830 1727204082.30079: done checking to see if all hosts have failed 13830 1727204082.30079: getting the remaining hosts for this loop 13830 1727204082.30081: done getting the remaining hosts for this loop 13830 1727204082.30085: getting the next task for host managed-node3 13830 1727204082.30092: done getting next task for host managed-node3 13830 1727204082.30096: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13830 1727204082.30101: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204082.30120: getting variables 13830 1727204082.30121: in VariableManager get_vars() 13830 1727204082.30157: Calling all_inventory to load vars for managed-node3 13830 1727204082.30160: Calling groups_inventory to load vars for managed-node3 13830 1727204082.30162: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204082.30176: done sending task result for task 0affcd87-79f5-1659-6b02-00000000027f 13830 1727204082.30180: WORKER PROCESS EXITING 13830 1727204082.30190: Calling all_plugins_play to load vars for managed-node3 13830 1727204082.30193: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204082.30195: Calling groups_plugins_play to load vars for managed-node3 13830 1727204082.31461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204082.32389: done with get_vars() 13830 1727204082.32408: done getting variables 13830 1727204082.32457: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.197) 0:00:15.402 ***** 13830 1727204082.32483: entering _queue_task() for managed-node3/package 13830 1727204082.32711: worker is 1 (out of 1 available) 13830 1727204082.32723: exiting _queue_task() for managed-node3/package 13830 1727204082.32736: done queuing things up, now waiting for results queue to drain 13830 1727204082.32738: waiting for pending results... 
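[editor's note] The "Install packages" task above was skipped because its conditional, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False: every package the role wants is already present in the gathered package facts. A minimal Python sketch of that check follows; the package names and versions are assumptions for illustration, not values taken from this run, and this is not Ansible's own implementation.

    # Sketch only: mirrors the false_condition reported above.
    # Package data here is assumed, not read from this run.
    network_packages = ["NetworkManager"]            # what the role would install
    packages_facts = {                               # stand-in for ansible_facts.packages
        "NetworkManager": [{"version": "1.0"}],
        "openssh-server": [{"version": "8.7"}],
    }

    # Jinja2's "is subset" test corresponds to set.issubset() in Python.
    needs_install = not set(network_packages).issubset(packages_facts.keys())
    print(needs_install)   # False -> the task is skipped, as in the log

When the condition is False the task executor returns the skip result without running any module, which is why the result above carries only changed, false_condition, and skip_reason.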
13830 1727204082.32947: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13830 1727204082.33115: in run() - task 0affcd87-79f5-1659-6b02-000000000280 13830 1727204082.33140: variable 'ansible_search_path' from source: unknown 13830 1727204082.33148: variable 'ansible_search_path' from source: unknown 13830 1727204082.33191: calling self._execute() 13830 1727204082.33288: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204082.33302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204082.33318: variable 'omit' from source: magic vars 13830 1727204082.33677: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.33695: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204082.33878: variable 'network_state' from source: role '' defaults 13830 1727204082.33905: Evaluated conditional (network_state != {}): False 13830 1727204082.33923: when evaluation is False, skipping this task 13830 1727204082.33947: _execute() done 13830 1727204082.33954: dumping result to json 13830 1727204082.33961: done dumping result, returning 13830 1727204082.33973: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1659-6b02-000000000280] 13830 1727204082.33985: sending task result for task 0affcd87-79f5-1659-6b02-000000000280 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204082.34211: no more pending results, returning what we have 13830 1727204082.34216: results queue empty 13830 1727204082.34217: checking for any_errors_fatal 13830 1727204082.34223: done checking for any_errors_fatal 13830 1727204082.34224: checking for max_fail_percentage 13830 1727204082.34226: done checking for max_fail_percentage 13830 1727204082.34227: checking to see if all hosts have failed and the running result is not ok 13830 1727204082.34228: done checking to see if all hosts have failed 13830 1727204082.34228: getting the remaining hosts for this loop 13830 1727204082.34230: done getting the remaining hosts for this loop 13830 1727204082.34234: getting the next task for host managed-node3 13830 1727204082.34242: done getting next task for host managed-node3 13830 1727204082.34246: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13830 1727204082.34251: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204082.34266: done sending task result for task 0affcd87-79f5-1659-6b02-000000000280 13830 1727204082.34270: WORKER PROCESS EXITING 13830 1727204082.34277: getting variables 13830 1727204082.34278: in VariableManager get_vars() 13830 1727204082.34309: Calling all_inventory to load vars for managed-node3 13830 1727204082.34311: Calling groups_inventory to load vars for managed-node3 13830 1727204082.34313: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204082.34322: Calling all_plugins_play to load vars for managed-node3 13830 1727204082.34325: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204082.34328: Calling groups_plugins_play to load vars for managed-node3 13830 1727204082.37832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204082.38757: done with get_vars() 13830 1727204082.38778: done getting variables 13830 1727204082.38817: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.063) 0:00:15.466 ***** 13830 1727204082.38843: entering _queue_task() for managed-node3/package 13830 1727204082.39080: worker is 1 (out of 1 available) 13830 1727204082.39094: exiting _queue_task() for managed-node3/package 13830 1727204082.39106: done queuing things up, now waiting for results queue to drain 13830 1727204082.39108: waiting for pending results... 
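[editor's note] Both "Install ... when using network_state variable" tasks are gated the same way: they run only on distributions other than major version 6 and only when the caller actually sets network_state (the role default is an empty dict). A rough Python sketch of that double gate, using assumed example values; it reproduces the logic reported by the two "Evaluated conditional" lines above, not the role's code.

    # Sketch of the gating logic shown above (values are assumptions).
    ansible_distribution_major_version = "9"   # illustrative; taken from facts in a real run
    network_state = {}                         # role default; this play never set it

    run_task = (ansible_distribution_major_version != "6") and (network_state != {})
    print(run_task)   # False -> "skip_reason": "Conditional result was False"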
13830 1727204082.39290: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13830 1727204082.39393: in run() - task 0affcd87-79f5-1659-6b02-000000000281 13830 1727204082.39403: variable 'ansible_search_path' from source: unknown 13830 1727204082.39407: variable 'ansible_search_path' from source: unknown 13830 1727204082.39435: calling self._execute() 13830 1727204082.39504: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204082.39508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204082.39517: variable 'omit' from source: magic vars 13830 1727204082.39783: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.39797: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204082.39883: variable 'network_state' from source: role '' defaults 13830 1727204082.39890: Evaluated conditional (network_state != {}): False 13830 1727204082.39893: when evaluation is False, skipping this task 13830 1727204082.39896: _execute() done 13830 1727204082.39900: dumping result to json 13830 1727204082.39902: done dumping result, returning 13830 1727204082.39909: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1659-6b02-000000000281] 13830 1727204082.39913: sending task result for task 0affcd87-79f5-1659-6b02-000000000281 13830 1727204082.40015: done sending task result for task 0affcd87-79f5-1659-6b02-000000000281 13830 1727204082.40018: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204082.40072: no more pending results, returning what we have 13830 1727204082.40077: results queue empty 13830 1727204082.40077: checking for any_errors_fatal 13830 1727204082.40086: done checking for any_errors_fatal 13830 1727204082.40087: checking for max_fail_percentage 13830 1727204082.40088: done checking for max_fail_percentage 13830 1727204082.40089: checking to see if all hosts have failed and the running result is not ok 13830 1727204082.40089: done checking to see if all hosts have failed 13830 1727204082.40090: getting the remaining hosts for this loop 13830 1727204082.40092: done getting the remaining hosts for this loop 13830 1727204082.40096: getting the next task for host managed-node3 13830 1727204082.40103: done getting next task for host managed-node3 13830 1727204082.40107: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13830 1727204082.40112: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204082.40135: getting variables 13830 1727204082.40137: in VariableManager get_vars() 13830 1727204082.40168: Calling all_inventory to load vars for managed-node3 13830 1727204082.40171: Calling groups_inventory to load vars for managed-node3 13830 1727204082.40173: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204082.40181: Calling all_plugins_play to load vars for managed-node3 13830 1727204082.40183: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204082.40186: Calling groups_plugins_play to load vars for managed-node3 13830 1727204082.40970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204082.41914: done with get_vars() 13830 1727204082.41934: done getting variables 13830 1727204082.42015: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.031) 0:00:15.498 ***** 13830 1727204082.42042: entering _queue_task() for managed-node3/service 13830 1727204082.42043: Creating lock for service 13830 1727204082.42281: worker is 1 (out of 1 available) 13830 1727204082.42296: exiting _queue_task() for managed-node3/service 13830 1727204082.42308: done queuing things up, now waiting for results queue to drain 13830 1727204082.42310: waiting for pending results... 
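[editor's note] Each conditional-gated task that is skipped in this run reports the same result shape, seen twice above for the NetworkManager/nmstate install tasks. A small sketch of that payload as a Python dict, with the field values copied from the log output; the helper function itself is hypothetical, added only to show the shape.

    # Hypothetical helper that builds the skip result seen in "skipping: [managed-node3] => {...}".
    def skip_result(false_condition: str) -> dict:
        return {
            "changed": False,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
        }

    print(skip_result("network_state != {}"))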
13830 1727204082.42494: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13830 1727204082.42593: in run() - task 0affcd87-79f5-1659-6b02-000000000282 13830 1727204082.42604: variable 'ansible_search_path' from source: unknown 13830 1727204082.42608: variable 'ansible_search_path' from source: unknown 13830 1727204082.42642: calling self._execute() 13830 1727204082.42716: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204082.42721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204082.42729: variable 'omit' from source: magic vars 13830 1727204082.43014: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.43024: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204082.43113: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204082.43249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204082.45087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204082.45136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204082.45173: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204082.45198: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204082.45219: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204082.45281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.45300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.45318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.45346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.45358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.45393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.45409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.45426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13830 1727204082.45452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.45462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.45494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.45510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.45526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.45553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.45563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.45675: variable 'network_connections' from source: include params 13830 1727204082.45685: variable 'controller_profile' from source: play vars 13830 1727204082.45736: variable 'controller_profile' from source: play vars 13830 1727204082.45744: variable 'controller_device' from source: play vars 13830 1727204082.45790: variable 'controller_device' from source: play vars 13830 1727204082.45802: variable 'port1_profile' from source: play vars 13830 1727204082.45845: variable 'port1_profile' from source: play vars 13830 1727204082.45851: variable 'dhcp_interface1' from source: play vars 13830 1727204082.45895: variable 'dhcp_interface1' from source: play vars 13830 1727204082.45899: variable 'controller_profile' from source: play vars 13830 1727204082.45945: variable 'controller_profile' from source: play vars 13830 1727204082.45951: variable 'port2_profile' from source: play vars 13830 1727204082.45994: variable 'port2_profile' from source: play vars 13830 1727204082.46000: variable 'dhcp_interface2' from source: play vars 13830 1727204082.46045: variable 'dhcp_interface2' from source: play vars 13830 1727204082.46051: variable 'controller_profile' from source: play vars 13830 1727204082.46094: variable 'controller_profile' from source: play vars 13830 1727204082.46146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204082.46256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204082.46293: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204082.46315: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204082.46337: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204082.46373: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204082.46388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204082.46405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.46423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204082.46475: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204082.46624: variable 'network_connections' from source: include params 13830 1727204082.46628: variable 'controller_profile' from source: play vars 13830 1727204082.46673: variable 'controller_profile' from source: play vars 13830 1727204082.46680: variable 'controller_device' from source: play vars 13830 1727204082.46721: variable 'controller_device' from source: play vars 13830 1727204082.46733: variable 'port1_profile' from source: play vars 13830 1727204082.46774: variable 'port1_profile' from source: play vars 13830 1727204082.46779: variable 'dhcp_interface1' from source: play vars 13830 1727204082.46824: variable 'dhcp_interface1' from source: play vars 13830 1727204082.46832: variable 'controller_profile' from source: play vars 13830 1727204082.46873: variable 'controller_profile' from source: play vars 13830 1727204082.46879: variable 'port2_profile' from source: play vars 13830 1727204082.46924: variable 'port2_profile' from source: play vars 13830 1727204082.46932: variable 'dhcp_interface2' from source: play vars 13830 1727204082.46973: variable 'dhcp_interface2' from source: play vars 13830 1727204082.46979: variable 'controller_profile' from source: play vars 13830 1727204082.47022: variable 'controller_profile' from source: play vars 13830 1727204082.47048: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204082.47051: when evaluation is False, skipping this task 13830 1727204082.47054: _execute() done 13830 1727204082.47056: dumping result to json 13830 1727204082.47058: done dumping result, returning 13830 1727204082.47066: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000282] 13830 1727204082.47071: sending task result for task 0affcd87-79f5-1659-6b02-000000000282 13830 1727204082.47163: done sending task result for task 0affcd87-79f5-1659-6b02-000000000282 13830 1727204082.47168: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204082.47212: no more pending results, returning what we have 13830 1727204082.47221: results queue empty 13830 1727204082.47222: checking for any_errors_fatal 13830 1727204082.47228: done checking for any_errors_fatal 13830 1727204082.47229: checking for max_fail_percentage 13830 1727204082.47233: done checking for max_fail_percentage 13830 
1727204082.47234: checking to see if all hosts have failed and the running result is not ok 13830 1727204082.47234: done checking to see if all hosts have failed 13830 1727204082.47235: getting the remaining hosts for this loop 13830 1727204082.47237: done getting the remaining hosts for this loop 13830 1727204082.47241: getting the next task for host managed-node3 13830 1727204082.47248: done getting next task for host managed-node3 13830 1727204082.47252: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13830 1727204082.47258: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204082.47273: getting variables 13830 1727204082.47275: in VariableManager get_vars() 13830 1727204082.47310: Calling all_inventory to load vars for managed-node3 13830 1727204082.47313: Calling groups_inventory to load vars for managed-node3 13830 1727204082.47315: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204082.47333: Calling all_plugins_play to load vars for managed-node3 13830 1727204082.47336: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204082.47339: Calling groups_plugins_play to load vars for managed-node3 13830 1727204082.48281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204082.49690: done with get_vars() 13830 1727204082.49714: done getting variables 13830 1727204082.49778: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.077) 0:00:15.576 ***** 13830 1727204082.49811: entering _queue_task() for managed-node3/service 13830 1727204082.50081: worker is 1 (out of 1 available) 13830 1727204082.50096: exiting _queue_task() for managed-node3/service 13830 1727204082.50109: done queuing things up, now waiting for results queue to drain 13830 1727204082.50111: waiting for pending results... 
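[editor's note] The "Restart NetworkManager due to wireless or team interfaces" task above was skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this connection set (a controller profile plus two port profiles on the dhcp test interfaces). A hedged Python sketch of how such a flag could be derived from network_connections; the connection entries and their types below are assumptions for illustration, not values read from this run.

    # Sketch only: scan a connection list for wireless/team types.
    # Example entries are assumptions, not the run's actual network_connections.
    network_connections = [
        {"name": "controller-profile", "type": "bond", "interface_name": "test-bond0"},
        {"name": "port1-profile", "type": "ethernet", "controller": "controller-profile"},
        {"name": "port2-profile", "type": "ethernet", "controller": "controller-profile"},
    ]

    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)

    restart_needed = wireless_defined or team_defined
    print(restart_needed)   # False -> NetworkManager is not restarted here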
13830 1727204082.50306: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13830 1727204082.50414: in run() - task 0affcd87-79f5-1659-6b02-000000000283 13830 1727204082.50423: variable 'ansible_search_path' from source: unknown 13830 1727204082.50427: variable 'ansible_search_path' from source: unknown 13830 1727204082.50458: calling self._execute() 13830 1727204082.50529: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204082.50535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204082.50543: variable 'omit' from source: magic vars 13830 1727204082.50822: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.50835: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204082.50952: variable 'network_provider' from source: set_fact 13830 1727204082.50955: variable 'network_state' from source: role '' defaults 13830 1727204082.50963: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13830 1727204082.50970: variable 'omit' from source: magic vars 13830 1727204082.51011: variable 'omit' from source: magic vars 13830 1727204082.51030: variable 'network_service_name' from source: role '' defaults 13830 1727204082.51082: variable 'network_service_name' from source: role '' defaults 13830 1727204082.51157: variable '__network_provider_setup' from source: role '' defaults 13830 1727204082.51161: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204082.51208: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204082.51215: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204082.51265: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204082.51412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204082.53747: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204082.53841: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204082.53895: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204082.53934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204082.53967: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204082.54054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.54093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.54129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.54177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 13830 1727204082.54196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.54250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.54280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.54307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.54358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.54379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.54627: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13830 1727204082.54760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.54791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.54821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.54872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.54890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.54992: variable 'ansible_python' from source: facts 13830 1727204082.55011: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13830 1727204082.55109: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204082.55202: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204082.55340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.55371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.55403: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.55454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.55475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.55533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204082.55573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204082.55601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.55654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204082.55676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204082.55827: variable 'network_connections' from source: include params 13830 1727204082.55841: variable 'controller_profile' from source: play vars 13830 1727204082.55928: variable 'controller_profile' from source: play vars 13830 1727204082.55946: variable 'controller_device' from source: play vars 13830 1727204082.56033: variable 'controller_device' from source: play vars 13830 1727204082.56056: variable 'port1_profile' from source: play vars 13830 1727204082.56139: variable 'port1_profile' from source: play vars 13830 1727204082.56154: variable 'dhcp_interface1' from source: play vars 13830 1727204082.56243: variable 'dhcp_interface1' from source: play vars 13830 1727204082.56258: variable 'controller_profile' from source: play vars 13830 1727204082.56342: variable 'controller_profile' from source: play vars 13830 1727204082.56358: variable 'port2_profile' from source: play vars 13830 1727204082.56443: variable 'port2_profile' from source: play vars 13830 1727204082.56459: variable 'dhcp_interface2' from source: play vars 13830 1727204082.56546: variable 'dhcp_interface2' from source: play vars 13830 1727204082.56561: variable 'controller_profile' from source: play vars 13830 1727204082.56647: variable 'controller_profile' from source: play vars 13830 1727204082.56772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204082.57010: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204082.57078: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204082.57127: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 
1727204082.57182: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204082.57251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204082.57295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204082.57331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204082.57376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204082.57431: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204082.57743: variable 'network_connections' from source: include params 13830 1727204082.57754: variable 'controller_profile' from source: play vars 13830 1727204082.57843: variable 'controller_profile' from source: play vars 13830 1727204082.57858: variable 'controller_device' from source: play vars 13830 1727204082.57943: variable 'controller_device' from source: play vars 13830 1727204082.57966: variable 'port1_profile' from source: play vars 13830 1727204082.58050: variable 'port1_profile' from source: play vars 13830 1727204082.58068: variable 'dhcp_interface1' from source: play vars 13830 1727204082.58150: variable 'dhcp_interface1' from source: play vars 13830 1727204082.58167: variable 'controller_profile' from source: play vars 13830 1727204082.58247: variable 'controller_profile' from source: play vars 13830 1727204082.58267: variable 'port2_profile' from source: play vars 13830 1727204082.58344: variable 'port2_profile' from source: play vars 13830 1727204082.58369: variable 'dhcp_interface2' from source: play vars 13830 1727204082.58446: variable 'dhcp_interface2' from source: play vars 13830 1727204082.58468: variable 'controller_profile' from source: play vars 13830 1727204082.58545: variable 'controller_profile' from source: play vars 13830 1727204082.58610: variable '__network_packages_default_wireless' from source: role '' defaults 13830 1727204082.58704: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204082.59035: variable 'network_connections' from source: include params 13830 1727204082.59046: variable 'controller_profile' from source: play vars 13830 1727204082.59129: variable 'controller_profile' from source: play vars 13830 1727204082.59140: variable 'controller_device' from source: play vars 13830 1727204082.59202: variable 'controller_device' from source: play vars 13830 1727204082.59219: variable 'port1_profile' from source: play vars 13830 1727204082.59290: variable 'port1_profile' from source: play vars 13830 1727204082.59301: variable 'dhcp_interface1' from source: play vars 13830 1727204082.59386: variable 'dhcp_interface1' from source: play vars 13830 1727204082.59398: variable 'controller_profile' from source: play vars 13830 1727204082.59481: variable 'controller_profile' from source: play vars 13830 1727204082.59493: variable 'port2_profile' from source: play vars 13830 1727204082.59578: variable 'port2_profile' from source: play vars 13830 
1727204082.59591: variable 'dhcp_interface2' from source: play vars 13830 1727204082.59672: variable 'dhcp_interface2' from source: play vars 13830 1727204082.59687: variable 'controller_profile' from source: play vars 13830 1727204082.59759: variable 'controller_profile' from source: play vars 13830 1727204082.59802: variable '__network_packages_default_team' from source: role '' defaults 13830 1727204082.59891: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204082.60244: variable 'network_connections' from source: include params 13830 1727204082.60255: variable 'controller_profile' from source: play vars 13830 1727204082.60337: variable 'controller_profile' from source: play vars 13830 1727204082.60350: variable 'controller_device' from source: play vars 13830 1727204082.60428: variable 'controller_device' from source: play vars 13830 1727204082.60449: variable 'port1_profile' from source: play vars 13830 1727204082.60523: variable 'port1_profile' from source: play vars 13830 1727204082.60543: variable 'dhcp_interface1' from source: play vars 13830 1727204082.60620: variable 'dhcp_interface1' from source: play vars 13830 1727204082.60631: variable 'controller_profile' from source: play vars 13830 1727204082.60716: variable 'controller_profile' from source: play vars 13830 1727204082.60728: variable 'port2_profile' from source: play vars 13830 1727204082.60813: variable 'port2_profile' from source: play vars 13830 1727204082.60824: variable 'dhcp_interface2' from source: play vars 13830 1727204082.60905: variable 'dhcp_interface2' from source: play vars 13830 1727204082.60916: variable 'controller_profile' from source: play vars 13830 1727204082.60996: variable 'controller_profile' from source: play vars 13830 1727204082.61065: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204082.61141: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204082.61153: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204082.61225: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204082.61431: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13830 1727204082.61968: variable 'network_connections' from source: include params 13830 1727204082.61980: variable 'controller_profile' from source: play vars 13830 1727204082.62042: variable 'controller_profile' from source: play vars 13830 1727204082.62058: variable 'controller_device' from source: play vars 13830 1727204082.62125: variable 'controller_device' from source: play vars 13830 1727204082.62143: variable 'port1_profile' from source: play vars 13830 1727204082.62213: variable 'port1_profile' from source: play vars 13830 1727204082.62225: variable 'dhcp_interface1' from source: play vars 13830 1727204082.62294: variable 'dhcp_interface1' from source: play vars 13830 1727204082.62306: variable 'controller_profile' from source: play vars 13830 1727204082.62367: variable 'controller_profile' from source: play vars 13830 1727204082.62383: variable 'port2_profile' from source: play vars 13830 1727204082.62446: variable 'port2_profile' from source: play vars 13830 1727204082.62457: variable 'dhcp_interface2' from source: play vars 13830 1727204082.62529: variable 'dhcp_interface2' from source: play vars 13830 1727204082.62541: variable 'controller_profile' from source: play vars 13830 1727204082.62610: variable 
'controller_profile' from source: play vars 13830 1727204082.62623: variable 'ansible_distribution' from source: facts 13830 1727204082.62630: variable '__network_rh_distros' from source: role '' defaults 13830 1727204082.62640: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.62671: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13830 1727204082.62866: variable 'ansible_distribution' from source: facts 13830 1727204082.62875: variable '__network_rh_distros' from source: role '' defaults 13830 1727204082.62883: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.62899: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13830 1727204082.63091: variable 'ansible_distribution' from source: facts 13830 1727204082.63100: variable '__network_rh_distros' from source: role '' defaults 13830 1727204082.63108: variable 'ansible_distribution_major_version' from source: facts 13830 1727204082.63155: variable 'network_provider' from source: set_fact 13830 1727204082.63183: variable 'omit' from source: magic vars 13830 1727204082.63215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204082.63251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204082.63279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204082.63299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204082.63312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204082.63343: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204082.63350: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204082.63364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204082.63462: Set connection var ansible_connection to ssh 13830 1727204082.63484: Set connection var ansible_timeout to 10 13830 1727204082.63494: Set connection var ansible_shell_executable to /bin/sh 13830 1727204082.63500: Set connection var ansible_shell_type to sh 13830 1727204082.63509: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204082.63521: Set connection var ansible_pipelining to False 13830 1727204082.63550: variable 'ansible_shell_executable' from source: unknown 13830 1727204082.63558: variable 'ansible_connection' from source: unknown 13830 1727204082.63567: variable 'ansible_module_compression' from source: unknown 13830 1727204082.63573: variable 'ansible_shell_type' from source: unknown 13830 1727204082.63585: variable 'ansible_shell_executable' from source: unknown 13830 1727204082.63591: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204082.63598: variable 'ansible_pipelining' from source: unknown 13830 1727204082.63603: variable 'ansible_timeout' from source: unknown 13830 1727204082.63611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204082.63726: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204082.63742: variable 'omit' from source: magic vars 13830 1727204082.63751: starting attempt loop 13830 1727204082.63757: running the handler 13830 1727204082.63846: variable 'ansible_facts' from source: unknown 13830 1727204082.64835: _low_level_execute_command(): starting 13830 1727204082.64849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204082.65630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204082.65652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204082.65669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204082.65688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204082.65730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204082.65742: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204082.65761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204082.65783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204082.65795: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204082.65805: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204082.65815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204082.65827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204082.65843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204082.65855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204082.65870: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204082.65885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204082.65958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204082.65989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204082.66004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204082.66084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204082.67714: stdout chunk (state=3): >>>/root <<< 13830 1727204082.67921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204082.67925: stdout chunk (state=3): >>><<< 13830 1727204082.67932: stderr chunk (state=3): >>><<< 13830 1727204082.68053: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204082.68057: _low_level_execute_command(): starting 13830 1727204082.68061: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488 `" && echo ansible-tmp-1727204082.6795323-15033-200163133832488="` echo /root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488 `" ) && sleep 0' 13830 1727204082.68695: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204082.68716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204082.68733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204082.68751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204082.68799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204082.68813: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204082.68834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204082.68852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204082.68867: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204082.68879: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204082.68891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204082.68905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204082.68921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204082.68940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204082.68952: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204082.68969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204082.69050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204082.69077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204082.69095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204082.69177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204082.70993: stdout chunk (state=3): >>>ansible-tmp-1727204082.6795323-15033-200163133832488=/root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488 <<< 13830 1727204082.71205: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 13830 1727204082.71210: stdout chunk (state=3): >>><<< 13830 1727204082.71213: stderr chunk (state=3): >>><<< 13830 1727204082.71477: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204082.6795323-15033-200163133832488=/root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204082.71481: variable 'ansible_module_compression' from source: unknown 13830 1727204082.71485: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 13830 1727204082.71488: ANSIBALLZ: Acquiring lock 13830 1727204082.71490: ANSIBALLZ: Lock acquired: 140043657885840 13830 1727204082.71493: ANSIBALLZ: Creating module 13830 1727204082.94834: ANSIBALLZ: Writing module into payload 13830 1727204082.94980: ANSIBALLZ: Writing module 13830 1727204082.95009: ANSIBALLZ: Renaming module 13830 1727204082.95012: ANSIBALLZ: Done creating module 13830 1727204082.95046: variable 'ansible_facts' from source: unknown 13830 1727204082.95186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488/AnsiballZ_systemd.py 13830 1727204082.95309: Sending initial data 13830 1727204082.95313: Sent initial data (156 bytes) 13830 1727204082.96006: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204082.96015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204082.96068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204082.96072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204082.96074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 
1727204082.96122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204082.96131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204082.96203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204082.97997: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204082.98035: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204082.98079: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpeone07ra /root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488/AnsiballZ_systemd.py <<< 13830 1727204082.98120: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204082.99867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204083.00182: stderr chunk (state=3): >>><<< 13830 1727204083.00186: stdout chunk (state=3): >>><<< 13830 1727204083.00188: done transferring module to remote 13830 1727204083.00190: _low_level_execute_command(): starting 13830 1727204083.00193: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488/ /root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488/AnsiballZ_systemd.py && sleep 0' 13830 1727204083.00755: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204083.00763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.00776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.00790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.00835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204083.00844: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204083.00854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.00870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204083.00878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204083.00885: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204083.00892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.00902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.00919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.00926: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204083.00937: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204083.00946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.01020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204083.01039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204083.01052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204083.01129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204083.02833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204083.02890: stderr chunk (state=3): >>><<< 13830 1727204083.02893: stdout chunk (state=3): >>><<< 13830 1727204083.02906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204083.02909: _low_level_execute_command(): starting 13830 1727204083.02914: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488/AnsiballZ_systemd.py && sleep 0' 13830 1727204083.03915: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.03918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.03966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.03970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204083.03985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.03990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.04071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204083.04079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204083.04091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204083.04176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204083.29018: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15843328", "MemoryAvailable": "infinity", "CPUUsageNSec": "649669000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": 
"infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSig<<< 13830 1727204083.29033: stdout chunk (state=3): >>>nal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target 
network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13830 1727204083.30641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204083.30646: stdout chunk (state=3): >>><<< 13830 1727204083.30648: stderr chunk (state=3): >>><<< 13830 1727204083.30955: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15843328", "MemoryAvailable": "infinity", "CPUUsageNSec": "649669000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204083.30969: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204083.30972: _low_level_execute_command(): starting 13830 1727204083.30975: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204082.6795323-15033-200163133832488/ > /dev/null 2>&1 && sleep 0' 13830 1727204083.31646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204083.31662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.31682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.31702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.31757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204083.31773: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204083.31788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.31807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204083.31821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204083.31838: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204083.31851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.31869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.31887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.31899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204083.31910: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204083.31924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.32011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204083.32028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204083.32050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204083.32135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204083.33872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204083.33935: stderr chunk (state=3): >>><<< 13830 1727204083.33939: stdout chunk (state=3): >>><<< 13830 1727204083.33955: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204083.33962: handler run complete 13830 1727204083.34002: attempt loop complete, returning result 13830 1727204083.34006: _execute() done 13830 1727204083.34008: dumping result to json 13830 1727204083.34021: done dumping result, returning 13830 1727204083.34034: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1659-6b02-000000000283] 13830 1727204083.34036: sending task result for task 0affcd87-79f5-1659-6b02-000000000283 13830 1727204083.34271: done sending task result for task 0affcd87-79f5-1659-6b02-000000000283 13830 1727204083.34274: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204083.34324: no more pending results, returning what we have 13830 1727204083.34327: results queue empty 13830 1727204083.34328: checking for any_errors_fatal 13830 1727204083.34336: done checking for any_errors_fatal 13830 1727204083.34337: checking for max_fail_percentage 13830 1727204083.34338: done checking for max_fail_percentage 13830 1727204083.34339: checking to see if all hosts have failed and the running result is not ok 13830 1727204083.34340: done checking to see if all hosts have failed 13830 1727204083.34340: getting the remaining hosts for this loop 13830 1727204083.34342: done getting the remaining hosts for this loop 13830 1727204083.34346: getting the next task for host managed-node3 13830 1727204083.34354: done getting next task for host managed-node3 13830 1727204083.34358: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13830 1727204083.34363: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204083.34375: getting variables 13830 1727204083.34376: in VariableManager get_vars() 13830 1727204083.34409: Calling all_inventory to load vars for managed-node3 13830 1727204083.34412: Calling groups_inventory to load vars for managed-node3 13830 1727204083.34414: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204083.34423: Calling all_plugins_play to load vars for managed-node3 13830 1727204083.34425: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204083.34427: Calling groups_plugins_play to load vars for managed-node3 13830 1727204083.35976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204083.36948: done with get_vars() 13830 1727204083.36968: done getting variables 13830 1727204083.37014: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.872) 0:00:16.448 ***** 13830 1727204083.37041: entering _queue_task() for managed-node3/service 13830 1727204083.37275: worker is 1 (out of 1 available) 13830 1727204083.37287: exiting _queue_task() for managed-node3/service 13830 1727204083.37300: done queuing things up, now waiting for results queue to drain 13830 1727204083.37302: waiting for pending results... 
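[Editor's note] The wpa_supplicant task traced below (main.yml:133) is a conditionally gated service task. The sketch here is reconstructed from the conditionals evaluated in the trace (network_provider == "nm" and __network_wpa_supplicant_required); it is not copied from the role's source, and the module choice mirrors the 'service' action plugin the log loads for this task.

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool

In the trace that follows, __network_wpa_supplicant_required evaluates to False (the role's defaults tie it to whether 802.1x or wireless connections are defined, and none are in this play), so the task is skipped without contacting the managed node.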
13830 1727204083.37481: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13830 1727204083.37584: in run() - task 0affcd87-79f5-1659-6b02-000000000284 13830 1727204083.37594: variable 'ansible_search_path' from source: unknown 13830 1727204083.37599: variable 'ansible_search_path' from source: unknown 13830 1727204083.37633: calling self._execute() 13830 1727204083.37701: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204083.37705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204083.37713: variable 'omit' from source: magic vars 13830 1727204083.38098: variable 'ansible_distribution_major_version' from source: facts 13830 1727204083.38101: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204083.38285: variable 'network_provider' from source: set_fact 13830 1727204083.38289: Evaluated conditional (network_provider == "nm"): True 13830 1727204083.38399: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204083.38402: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204083.38556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204083.40625: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204083.40676: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204083.40704: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204083.40733: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204083.40752: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204083.40822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204083.40845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204083.40866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204083.40894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204083.40906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204083.40939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204083.40956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 13830 1727204083.40975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204083.41002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204083.41013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204083.41044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204083.41060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204083.41078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204083.41105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204083.41114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204083.41215: variable 'network_connections' from source: include params 13830 1727204083.41222: variable 'controller_profile' from source: play vars 13830 1727204083.41277: variable 'controller_profile' from source: play vars 13830 1727204083.41286: variable 'controller_device' from source: play vars 13830 1727204083.41334: variable 'controller_device' from source: play vars 13830 1727204083.41343: variable 'port1_profile' from source: play vars 13830 1727204083.41389: variable 'port1_profile' from source: play vars 13830 1727204083.41395: variable 'dhcp_interface1' from source: play vars 13830 1727204083.41438: variable 'dhcp_interface1' from source: play vars 13830 1727204083.41445: variable 'controller_profile' from source: play vars 13830 1727204083.41490: variable 'controller_profile' from source: play vars 13830 1727204083.41496: variable 'port2_profile' from source: play vars 13830 1727204083.41539: variable 'port2_profile' from source: play vars 13830 1727204083.41546: variable 'dhcp_interface2' from source: play vars 13830 1727204083.41589: variable 'dhcp_interface2' from source: play vars 13830 1727204083.41597: variable 'controller_profile' from source: play vars 13830 1727204083.41638: variable 'controller_profile' from source: play vars 13830 1727204083.41692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204083.41803: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204083.42605: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204083.42609: Loading TestModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204083.42611: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204083.42614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204083.42616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204083.42618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204083.42632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204083.42635: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204083.42637: variable 'network_connections' from source: include params 13830 1727204083.42639: variable 'controller_profile' from source: play vars 13830 1727204083.42641: variable 'controller_profile' from source: play vars 13830 1727204083.42643: variable 'controller_device' from source: play vars 13830 1727204083.42645: variable 'controller_device' from source: play vars 13830 1727204083.42647: variable 'port1_profile' from source: play vars 13830 1727204083.42649: variable 'port1_profile' from source: play vars 13830 1727204083.42651: variable 'dhcp_interface1' from source: play vars 13830 1727204083.42653: variable 'dhcp_interface1' from source: play vars 13830 1727204083.42655: variable 'controller_profile' from source: play vars 13830 1727204083.42657: variable 'controller_profile' from source: play vars 13830 1727204083.42659: variable 'port2_profile' from source: play vars 13830 1727204083.42988: variable 'port2_profile' from source: play vars 13830 1727204083.42991: variable 'dhcp_interface2' from source: play vars 13830 1727204083.42992: variable 'dhcp_interface2' from source: play vars 13830 1727204083.42994: variable 'controller_profile' from source: play vars 13830 1727204083.42996: variable 'controller_profile' from source: play vars 13830 1727204083.42997: Evaluated conditional (__network_wpa_supplicant_required): False 13830 1727204083.42999: when evaluation is False, skipping this task 13830 1727204083.43000: _execute() done 13830 1727204083.43002: dumping result to json 13830 1727204083.43003: done dumping result, returning 13830 1727204083.43005: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1659-6b02-000000000284] 13830 1727204083.43006: sending task result for task 0affcd87-79f5-1659-6b02-000000000284 13830 1727204083.43078: done sending task result for task 0affcd87-79f5-1659-6b02-000000000284 13830 1727204083.43080: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13830 1727204083.43138: no more pending results, returning what we have 13830 1727204083.43141: results queue empty 13830 1727204083.43142: checking for any_errors_fatal 13830 1727204083.43160: done checking for any_errors_fatal 13830 
1727204083.43160: checking for max_fail_percentage 13830 1727204083.43162: done checking for max_fail_percentage 13830 1727204083.43162: checking to see if all hosts have failed and the running result is not ok 13830 1727204083.43163: done checking to see if all hosts have failed 13830 1727204083.43165: getting the remaining hosts for this loop 13830 1727204083.43167: done getting the remaining hosts for this loop 13830 1727204083.43171: getting the next task for host managed-node3 13830 1727204083.43176: done getting next task for host managed-node3 13830 1727204083.43181: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13830 1727204083.43186: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204083.43198: getting variables 13830 1727204083.43199: in VariableManager get_vars() 13830 1727204083.43235: Calling all_inventory to load vars for managed-node3 13830 1727204083.43238: Calling groups_inventory to load vars for managed-node3 13830 1727204083.43240: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204083.43252: Calling all_plugins_play to load vars for managed-node3 13830 1727204083.43255: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204083.43257: Calling groups_plugins_play to load vars for managed-node3 13830 1727204083.44731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204083.45673: done with get_vars() 13830 1727204083.45694: done getting variables 13830 1727204083.45740: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.087) 0:00:16.535 ***** 13830 1727204083.45769: entering _queue_task() for managed-node3/service 13830 1727204083.46005: worker is 1 (out of 1 available) 13830 1727204083.46019: exiting _queue_task() for managed-node3/service 13830 1727204083.46031: done queuing things up, now waiting for results queue to drain 13830 1727204083.46033: waiting for pending results... 13830 1727204083.46218: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 13830 1727204083.46310: in run() - task 0affcd87-79f5-1659-6b02-000000000285 13830 1727204083.46322: variable 'ansible_search_path' from source: unknown 13830 1727204083.46325: variable 'ansible_search_path' from source: unknown 13830 1727204083.46358: calling self._execute() 13830 1727204083.46536: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204083.46540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204083.46543: variable 'omit' from source: magic vars 13830 1727204083.46905: variable 'ansible_distribution_major_version' from source: facts 13830 1727204083.46922: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204083.47048: variable 'network_provider' from source: set_fact 13830 1727204083.47059: Evaluated conditional (network_provider == "initscripts"): False 13830 1727204083.47069: when evaluation is False, skipping this task 13830 1727204083.47077: _execute() done 13830 1727204083.47089: dumping result to json 13830 1727204083.47097: done dumping result, returning 13830 1727204083.47110: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1659-6b02-000000000285] 13830 1727204083.47120: sending task result for task 0affcd87-79f5-1659-6b02-000000000285 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204083.47274: no more pending results, returning what we have 13830 1727204083.47278: results queue empty 13830 1727204083.47279: checking for 
any_errors_fatal 13830 1727204083.47287: done checking for any_errors_fatal 13830 1727204083.47288: checking for max_fail_percentage 13830 1727204083.47291: done checking for max_fail_percentage 13830 1727204083.47292: checking to see if all hosts have failed and the running result is not ok 13830 1727204083.47292: done checking to see if all hosts have failed 13830 1727204083.47293: getting the remaining hosts for this loop 13830 1727204083.47295: done getting the remaining hosts for this loop 13830 1727204083.47300: getting the next task for host managed-node3 13830 1727204083.47310: done getting next task for host managed-node3 13830 1727204083.47314: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13830 1727204083.47321: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204083.47339: getting variables 13830 1727204083.47341: in VariableManager get_vars() 13830 1727204083.47382: Calling all_inventory to load vars for managed-node3 13830 1727204083.47386: Calling groups_inventory to load vars for managed-node3 13830 1727204083.47388: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204083.47400: Calling all_plugins_play to load vars for managed-node3 13830 1727204083.47403: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204083.47407: Calling groups_plugins_play to load vars for managed-node3 13830 1727204083.48171: done sending task result for task 0affcd87-79f5-1659-6b02-000000000285 13830 1727204083.48175: WORKER PROCESS EXITING 13830 1727204083.48632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204083.49560: done with get_vars() 13830 1727204083.49582: done getting variables 13830 1727204083.49629: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.038) 0:00:16.574 ***** 13830 1727204083.49657: entering _queue_task() for managed-node3/copy 13830 1727204083.49897: worker is 1 (out of 1 available) 13830 1727204083.49912: exiting _queue_task() for managed-node3/copy 13830 1727204083.49925: done queuing things up, now waiting for results queue to drain 13830 1727204083.49926: waiting for pending results... 
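
(Editor's note) The task queued above, like the "Enable network service" task before it, is guarded by the same provider check and is skipped in the next log entry because network_provider resolved to "nm" rather than "initscripts". A minimal sketch of how such a provider-gated task can be expressed follows; the task body, destination path, and file contents are assumptions for illustration only (the role's actual task lives at roles/network/tasks/main.yml:150), while the `when` condition matches the one evaluated in the log.

    # Hedged sketch only -- body and dest are assumed, not the role's source.
    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network            # assumed destination
        content: "# Created by the network system role\n"
        mode: "0644"
      when: network_provider == "initscripts"   # False in this run, so the task is skipped
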
13830 1727204083.50115: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13830 1727204083.50218: in run() - task 0affcd87-79f5-1659-6b02-000000000286 13830 1727204083.50232: variable 'ansible_search_path' from source: unknown 13830 1727204083.50236: variable 'ansible_search_path' from source: unknown 13830 1727204083.50268: calling self._execute() 13830 1727204083.50341: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204083.50345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204083.50355: variable 'omit' from source: magic vars 13830 1727204083.50639: variable 'ansible_distribution_major_version' from source: facts 13830 1727204083.50649: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204083.50738: variable 'network_provider' from source: set_fact 13830 1727204083.50741: Evaluated conditional (network_provider == "initscripts"): False 13830 1727204083.50744: when evaluation is False, skipping this task 13830 1727204083.50747: _execute() done 13830 1727204083.50750: dumping result to json 13830 1727204083.50752: done dumping result, returning 13830 1727204083.50755: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1659-6b02-000000000286] 13830 1727204083.50761: sending task result for task 0affcd87-79f5-1659-6b02-000000000286 13830 1727204083.50857: done sending task result for task 0affcd87-79f5-1659-6b02-000000000286 13830 1727204083.50860: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13830 1727204083.50913: no more pending results, returning what we have 13830 1727204083.50917: results queue empty 13830 1727204083.50917: checking for any_errors_fatal 13830 1727204083.50926: done checking for any_errors_fatal 13830 1727204083.50927: checking for max_fail_percentage 13830 1727204083.50928: done checking for max_fail_percentage 13830 1727204083.50929: checking to see if all hosts have failed and the running result is not ok 13830 1727204083.50932: done checking to see if all hosts have failed 13830 1727204083.50933: getting the remaining hosts for this loop 13830 1727204083.50934: done getting the remaining hosts for this loop 13830 1727204083.50938: getting the next task for host managed-node3 13830 1727204083.50945: done getting next task for host managed-node3 13830 1727204083.50949: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13830 1727204083.50956: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204083.50978: getting variables 13830 1727204083.50980: in VariableManager get_vars() 13830 1727204083.51013: Calling all_inventory to load vars for managed-node3 13830 1727204083.51015: Calling groups_inventory to load vars for managed-node3 13830 1727204083.51018: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204083.51026: Calling all_plugins_play to load vars for managed-node3 13830 1727204083.51028: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204083.51033: Calling groups_plugins_play to load vars for managed-node3 13830 1727204083.51831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204083.52774: done with get_vars() 13830 1727204083.52794: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:43 -0400 (0:00:00.032) 0:00:16.606 ***** 13830 1727204083.52863: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 13830 1727204083.52866: Creating lock for fedora.linux_system_roles.network_connections 13830 1727204083.53112: worker is 1 (out of 1 available) 13830 1727204083.53126: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 13830 1727204083.53141: done queuing things up, now waiting for results queue to drain 13830 1727204083.53142: waiting for pending results... 
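
(Editor's note) The module arguments logged further down show the connection profiles this task pushes through the "nm" provider: an 802.3ad bond "bond0" on device nm-bond with two ethernet ports on test1 and test2. A sketch of roughly how a playbook would request that through the role's network_connections variable, reconstructed from those logged arguments; the profile names, interface names, and the bond options shown are taken from the log (the full option list is abbreviated), while the surrounding play structure is an assumption.

    # Reconstructed from the module args in this log, not copied from the test playbook.
    - hosts: managed-node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: bond0
                state: up
                type: bond
                interface_name: nm-bond
                bond:
                  mode: 802.3ad
                  miimon: 110
                  xmit_hash_policy: encap2+3
                  # ...remaining bond options as logged in the module invocation below
                ip:
                  route_metric4: 65535
              - name: bond0.0
                state: up
                type: ethernet
                interface_name: test1
                controller: bond0
              - name: bond0.1
                state: up
                type: ethernet
                interface_name: test2
                controller: bond0

The log entries that follow show how the role turns that request into a module run: it renders the ansible_managed header from get_ansible_managed.j2, builds the AnsiballZ payload for network_connections, copies it to a temporary directory on the managed host over the multiplexed SSH connection, and executes it with the remote Python interpreter.
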
13830 1727204083.53325: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13830 1727204083.53426: in run() - task 0affcd87-79f5-1659-6b02-000000000287 13830 1727204083.53440: variable 'ansible_search_path' from source: unknown 13830 1727204083.53444: variable 'ansible_search_path' from source: unknown 13830 1727204083.53477: calling self._execute() 13830 1727204083.53543: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204083.53547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204083.53556: variable 'omit' from source: magic vars 13830 1727204083.53839: variable 'ansible_distribution_major_version' from source: facts 13830 1727204083.53849: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204083.53855: variable 'omit' from source: magic vars 13830 1727204083.53898: variable 'omit' from source: magic vars 13830 1727204083.54013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204083.55941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204083.56192: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204083.56196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204083.56199: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204083.56201: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204083.56204: variable 'network_provider' from source: set_fact 13830 1727204083.56297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204083.56325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204083.56352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204083.56395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204083.56409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204083.56486: variable 'omit' from source: magic vars 13830 1727204083.56599: variable 'omit' from source: magic vars 13830 1727204083.56700: variable 'network_connections' from source: include params 13830 1727204083.56711: variable 'controller_profile' from source: play vars 13830 1727204083.56773: variable 'controller_profile' from source: play vars 13830 1727204083.56779: variable 'controller_device' from source: play vars 13830 1727204083.56839: variable 'controller_device' from source: play vars 13830 1727204083.56853: variable 
'port1_profile' from source: play vars 13830 1727204083.56911: variable 'port1_profile' from source: play vars 13830 1727204083.56918: variable 'dhcp_interface1' from source: play vars 13830 1727204083.56978: variable 'dhcp_interface1' from source: play vars 13830 1727204083.56984: variable 'controller_profile' from source: play vars 13830 1727204083.57046: variable 'controller_profile' from source: play vars 13830 1727204083.57054: variable 'port2_profile' from source: play vars 13830 1727204083.57138: variable 'port2_profile' from source: play vars 13830 1727204083.57144: variable 'dhcp_interface2' from source: play vars 13830 1727204083.57203: variable 'dhcp_interface2' from source: play vars 13830 1727204083.57210: variable 'controller_profile' from source: play vars 13830 1727204083.57270: variable 'controller_profile' from source: play vars 13830 1727204083.57486: variable 'omit' from source: magic vars 13830 1727204083.57494: variable '__lsr_ansible_managed' from source: task vars 13830 1727204083.57556: variable '__lsr_ansible_managed' from source: task vars 13830 1727204083.57741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13830 1727204083.57961: Loaded config def from plugin (lookup/template) 13830 1727204083.57968: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13830 1727204083.57997: File lookup term: get_ansible_managed.j2 13830 1727204083.58000: variable 'ansible_search_path' from source: unknown 13830 1727204083.58003: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13830 1727204083.58027: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13830 1727204083.58030: variable 'ansible_search_path' from source: unknown 13830 1727204083.63709: variable 'ansible_managed' from source: unknown 13830 1727204083.63863: variable 'omit' from source: magic vars 13830 1727204083.63891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204083.63918: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204083.63938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204083.63955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204083.63966: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204083.63993: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204083.63997: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204083.63999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204083.64095: Set connection var ansible_connection to ssh 13830 1727204083.64105: Set connection var ansible_timeout to 10 13830 1727204083.64111: Set connection var ansible_shell_executable to /bin/sh 13830 1727204083.64113: Set connection var ansible_shell_type to sh 13830 1727204083.64120: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204083.64133: Set connection var ansible_pipelining to False 13830 1727204083.64151: variable 'ansible_shell_executable' from source: unknown 13830 1727204083.64154: variable 'ansible_connection' from source: unknown 13830 1727204083.64157: variable 'ansible_module_compression' from source: unknown 13830 1727204083.64159: variable 'ansible_shell_type' from source: unknown 13830 1727204083.64161: variable 'ansible_shell_executable' from source: unknown 13830 1727204083.64165: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204083.64170: variable 'ansible_pipelining' from source: unknown 13830 1727204083.64173: variable 'ansible_timeout' from source: unknown 13830 1727204083.64177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204083.64305: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204083.64314: variable 'omit' from source: magic vars 13830 1727204083.64323: starting attempt loop 13830 1727204083.64326: running the handler 13830 1727204083.64340: _low_level_execute_command(): starting 13830 1727204083.64347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204083.65214: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204083.65239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.65248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.65262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.65304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204083.65310: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204083.65320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.65335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204083.65342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204083.65349: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204083.65356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.65367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.65379: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.65386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204083.65393: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204083.65402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.65474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204083.65492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204083.65504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204083.65575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204083.67280: stdout chunk (state=3): >>>/root <<< 13830 1727204083.67473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204083.67476: stdout chunk (state=3): >>><<< 13830 1727204083.67479: stderr chunk (state=3): >>><<< 13830 1727204083.67481: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204083.67484: _low_level_execute_command(): starting 13830 1727204083.67486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710 `" && echo ansible-tmp-1727204083.6740572-15071-279132161117710="` echo /root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710 `" ) && sleep 0' 13830 1727204083.69428: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204083.69453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.69478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.69496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.69543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204083.69555: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204083.69576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 
1727204083.69595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204083.69607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204083.69618: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204083.69632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204083.69646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204083.69661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204083.69674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204083.69688: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204083.69701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204083.69791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204083.69815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204083.69832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204083.69927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204083.71747: stdout chunk (state=3): >>>ansible-tmp-1727204083.6740572-15071-279132161117710=/root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710 <<< 13830 1727204083.72060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204083.72066: stdout chunk (state=3): >>><<< 13830 1727204083.72069: stderr chunk (state=3): >>><<< 13830 1727204083.72276: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204083.6740572-15071-279132161117710=/root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204083.72280: variable 'ansible_module_compression' from source: unknown 13830 1727204083.72283: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 13830 1727204083.72285: ANSIBALLZ: Acquiring lock 13830 1727204083.72287: ANSIBALLZ: Lock acquired: 140043654407056 13830 1727204083.72289: ANSIBALLZ: Creating module 13830 1727204083.99076: ANSIBALLZ: Writing module into payload 13830 1727204083.99414: 
ANSIBALLZ: Writing module 13830 1727204083.99439: ANSIBALLZ: Renaming module 13830 1727204083.99443: ANSIBALLZ: Done creating module 13830 1727204083.99468: variable 'ansible_facts' from source: unknown 13830 1727204083.99534: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710/AnsiballZ_network_connections.py 13830 1727204083.99645: Sending initial data 13830 1727204083.99648: Sent initial data (168 bytes) 13830 1727204084.00374: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204084.00378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204084.00411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204084.00414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204084.00417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204084.00467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204084.00479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204084.00550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204084.02370: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204084.02404: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204084.02443: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmplwjmsz54 /root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710/AnsiballZ_network_connections.py <<< 13830 1727204084.02483: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204084.03633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204084.03752: stderr chunk (state=3): >>><<< 13830 1727204084.03756: stdout chunk (state=3): >>><<< 13830 1727204084.03777: done transferring module to remote 13830 1727204084.03788: _low_level_execute_command(): starting 13830 1727204084.03794: _low_level_execute_command(): executing: /bin/sh -c 'chmod 
u+x /root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710/ /root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710/AnsiballZ_network_connections.py && sleep 0' 13830 1727204084.04276: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204084.04279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204084.04292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204084.04320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204084.04336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204084.04349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204084.04399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204084.04408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204084.04412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204084.04471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204084.06154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204084.06211: stderr chunk (state=3): >>><<< 13830 1727204084.06214: stdout chunk (state=3): >>><<< 13830 1727204084.06232: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204084.06240: _low_level_execute_command(): starting 13830 1727204084.06243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710/AnsiballZ_network_connections.py && sleep 0' 13830 1727204084.06720: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204084.06733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204084.06755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204084.06770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204084.06827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204084.06841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204084.06894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204084.45378: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, 
"lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13830 1727204084.47095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204084.47192: stderr chunk (state=3): >>><<< 13830 1727204084.47196: stdout chunk (state=3): >>><<< 13830 1727204084.47370: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204084.47380: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': '802.3ad', 'ad_actor_sys_prio': 65535, 'ad_actor_system': '00:00:5e:00:53:5d', 'ad_select': 'stable', 'ad_user_port_key': 1023, 'all_ports_active': True, 'downdelay': 0, 'lacp_rate': 'slow', 'lp_interval': 128, 'miimon': 110, 'min_links': 0, 'num_grat_arp': 64, 'primary_reselect': 'better', 'resend_igmp': 225, 'updelay': 0, 'use_carrier': True, 'xmit_hash_policy': 'encap2+3'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204084.47382: _low_level_execute_command(): starting 13830 1727204084.47422: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204083.6740572-15071-279132161117710/ > /dev/null 2>&1 && sleep 0' 13830 1727204084.49850: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204084.49884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204084.49898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204084.49916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204084.50113: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204084.50126: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204084.50144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204084.50161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204084.50175: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204084.50185: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204084.50196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204084.50207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204084.50221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204084.50237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204084.50248: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204084.50266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204084.50346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204084.50363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204084.50380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204084.50643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204084.52573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204084.52577: stdout chunk (state=3): >>><<< 13830 1727204084.52587: stderr chunk (state=3): >>><<< 13830 1727204084.52773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204084.52776: handler run complete 13830 1727204084.52778: attempt loop complete, returning result 13830 1727204084.52780: _execute() done 13830 1727204084.52782: dumping result to json 13830 1727204084.52783: done dumping result, returning 13830 1727204084.52785: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 
[0affcd87-79f5-1659-6b02-000000000287] 13830 1727204084.52786: sending task result for task 0affcd87-79f5-1659-6b02-000000000287 changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 (not-active) 13830 1727204084.53035: no more pending results, returning what we have 13830 1727204084.53039: results queue empty 13830 1727204084.53040: checking for any_errors_fatal 13830 1727204084.53046: done checking for any_errors_fatal 13830 1727204084.53046: checking for max_fail_percentage 13830 1727204084.53048: done checking for max_fail_percentage 13830 1727204084.53049: checking to see if all hosts have failed and the running result is not ok 13830 1727204084.53050: done checking to see if all hosts have failed 13830 1727204084.53050: getting the remaining hosts for this loop 13830 1727204084.53052: done getting the remaining hosts for this loop 13830 1727204084.53056: getting the next task for host managed-node3 13830 1727204084.53065: done getting next task for host managed-node3 13830 1727204084.53069: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13830 1727204084.53074: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204084.53085: getting variables 13830 1727204084.53087: in VariableManager get_vars() 13830 1727204084.53123: Calling all_inventory to load vars for managed-node3 13830 1727204084.53126: Calling groups_inventory to load vars for managed-node3 13830 1727204084.53128: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204084.53139: Calling all_plugins_play to load vars for managed-node3 13830 1727204084.53142: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204084.53145: Calling groups_plugins_play to load vars for managed-node3 13830 1727204084.53666: done sending task result for task 0affcd87-79f5-1659-6b02-000000000287 13830 1727204084.53670: WORKER PROCESS EXITING 13830 1727204084.56199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204084.60974: done with get_vars() 13830 1727204084.61008: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:44 -0400 (0:00:01.082) 0:00:17.689 ***** 13830 1727204084.61099: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 13830 1727204084.61101: Creating lock for fedora.linux_system_roles.network_state 13830 1727204084.61829: worker is 1 (out of 1 available) 13830 1727204084.61844: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 13830 1727204084.61856: done queuing things up, now waiting for results queue to drain 13830 1727204084.61858: waiting for pending results... 
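
(Editor's note) The task queued above is evaluated next and skipped, because network_state still holds the role default of an empty dict, so the conditional `network_state != {}` is False. For contrast, a hedged sketch of what a non-empty, nmstate-style network_state might look like; the keys shown are an assumption for illustration and are not taken from this run.

    # Assumed example only -- this run leaves network_state at its default of {}.
    network_state:
      interfaces:
        - name: eth0
          type: ethernet
          state: up
          ipv4:
            enabled: true
            dhcp: true
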
13830 1727204084.62539: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 13830 1727204084.63133: in run() - task 0affcd87-79f5-1659-6b02-000000000288 13830 1727204084.63154: variable 'ansible_search_path' from source: unknown 13830 1727204084.63160: variable 'ansible_search_path' from source: unknown 13830 1727204084.63203: calling self._execute() 13830 1727204084.63292: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204084.63309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204084.63324: variable 'omit' from source: magic vars 13830 1727204084.63716: variable 'ansible_distribution_major_version' from source: facts 13830 1727204084.63738: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204084.64067: variable 'network_state' from source: role '' defaults 13830 1727204084.64082: Evaluated conditional (network_state != {}): False 13830 1727204084.64090: when evaluation is False, skipping this task 13830 1727204084.64097: _execute() done 13830 1727204084.64104: dumping result to json 13830 1727204084.64112: done dumping result, returning 13830 1727204084.64123: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1659-6b02-000000000288] 13830 1727204084.64134: sending task result for task 0affcd87-79f5-1659-6b02-000000000288 13830 1727204084.64362: done sending task result for task 0affcd87-79f5-1659-6b02-000000000288 13830 1727204084.64375: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204084.64435: no more pending results, returning what we have 13830 1727204084.64439: results queue empty 13830 1727204084.64439: checking for any_errors_fatal 13830 1727204084.64454: done checking for any_errors_fatal 13830 1727204084.64455: checking for max_fail_percentage 13830 1727204084.64457: done checking for max_fail_percentage 13830 1727204084.64457: checking to see if all hosts have failed and the running result is not ok 13830 1727204084.64458: done checking to see if all hosts have failed 13830 1727204084.64458: getting the remaining hosts for this loop 13830 1727204084.64460: done getting the remaining hosts for this loop 13830 1727204084.64466: getting the next task for host managed-node3 13830 1727204084.64474: done getting next task for host managed-node3 13830 1727204084.64478: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13830 1727204084.64485: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204084.64503: getting variables 13830 1727204084.64505: in VariableManager get_vars() 13830 1727204084.64548: Calling all_inventory to load vars for managed-node3 13830 1727204084.64551: Calling groups_inventory to load vars for managed-node3 13830 1727204084.64554: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204084.64568: Calling all_plugins_play to load vars for managed-node3 13830 1727204084.64571: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204084.64575: Calling groups_plugins_play to load vars for managed-node3 13830 1727204084.68002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204084.71334: done with get_vars() 13830 1727204084.71776: done getting variables 13830 1727204084.71841: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.107) 0:00:17.796 ***** 13830 1727204084.71880: entering _queue_task() for managed-node3/debug 13830 1727204084.72521: worker is 1 (out of 1 available) 13830 1727204084.72538: exiting _queue_task() for managed-node3/debug 13830 1727204084.72549: done queuing things up, now waiting for results queue to drain 13830 1727204084.72550: waiting for pending results... 
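
(Editor's note) The debug task queued above prints the stderr lines captured from the network_connections module run; its "ok" result below echoes __network_connections_result.stderr_lines. A minimal sketch of a debug task with that shape, where only the variable name is taken from the log and the exact task body is assumed rather than quoted from the role:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
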
13830 1727204084.73338: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13830 1727204084.73472: in run() - task 0affcd87-79f5-1659-6b02-000000000289 13830 1727204084.73494: variable 'ansible_search_path' from source: unknown 13830 1727204084.73501: variable 'ansible_search_path' from source: unknown 13830 1727204084.73541: calling self._execute() 13830 1727204084.73640: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204084.74278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204084.74296: variable 'omit' from source: magic vars 13830 1727204084.74657: variable 'ansible_distribution_major_version' from source: facts 13830 1727204084.74676: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204084.74687: variable 'omit' from source: magic vars 13830 1727204084.74754: variable 'omit' from source: magic vars 13830 1727204084.74795: variable 'omit' from source: magic vars 13830 1727204084.74842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204084.74882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204084.74907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204084.75536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204084.75552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204084.75590: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204084.75597: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204084.75604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204084.75705: Set connection var ansible_connection to ssh 13830 1727204084.75720: Set connection var ansible_timeout to 10 13830 1727204084.75729: Set connection var ansible_shell_executable to /bin/sh 13830 1727204084.75735: Set connection var ansible_shell_type to sh 13830 1727204084.75743: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204084.75756: Set connection var ansible_pipelining to False 13830 1727204084.75785: variable 'ansible_shell_executable' from source: unknown 13830 1727204084.75792: variable 'ansible_connection' from source: unknown 13830 1727204084.75798: variable 'ansible_module_compression' from source: unknown 13830 1727204084.75804: variable 'ansible_shell_type' from source: unknown 13830 1727204084.75810: variable 'ansible_shell_executable' from source: unknown 13830 1727204084.75816: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204084.75822: variable 'ansible_pipelining' from source: unknown 13830 1727204084.75828: variable 'ansible_timeout' from source: unknown 13830 1727204084.75835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204084.75972: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 
1727204084.75988: variable 'omit' from source: magic vars 13830 1727204084.75997: starting attempt loop 13830 1727204084.76003: running the handler 13830 1727204084.76144: variable '__network_connections_result' from source: set_fact 13830 1727204084.76222: handler run complete 13830 1727204084.76249: attempt loop complete, returning result 13830 1727204084.76256: _execute() done 13830 1727204084.76261: dumping result to json 13830 1727204084.76271: done dumping result, returning 13830 1727204084.76282: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1659-6b02-000000000289] 13830 1727204084.76291: sending task result for task 0affcd87-79f5-1659-6b02-000000000289 ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 (not-active)" ] } 13830 1727204084.76468: no more pending results, returning what we have 13830 1727204084.76472: results queue empty 13830 1727204084.76473: checking for any_errors_fatal 13830 1727204084.76480: done checking for any_errors_fatal 13830 1727204084.76481: checking for max_fail_percentage 13830 1727204084.76482: done checking for max_fail_percentage 13830 1727204084.76483: checking to see if all hosts have failed and the running result is not ok 13830 1727204084.76483: done checking to see if all hosts have failed 13830 1727204084.76484: getting the remaining hosts for this loop 13830 1727204084.76486: done getting the remaining hosts for this loop 13830 1727204084.76490: getting the next task for host managed-node3 13830 1727204084.76497: done getting next task for host managed-node3 13830 1727204084.76500: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13830 1727204084.76506: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204084.76519: done sending task result for task 0affcd87-79f5-1659-6b02-000000000289 13830 1727204084.76524: WORKER PROCESS EXITING 13830 1727204084.76532: getting variables 13830 1727204084.76534: in VariableManager get_vars() 13830 1727204084.76571: Calling all_inventory to load vars for managed-node3 13830 1727204084.76573: Calling groups_inventory to load vars for managed-node3 13830 1727204084.76576: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204084.76585: Calling all_plugins_play to load vars for managed-node3 13830 1727204084.76594: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204084.76597: Calling groups_plugins_play to load vars for managed-node3 13830 1727204084.78379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204084.81638: done with get_vars() 13830 1727204084.81673: done getting variables 13830 1727204084.81735: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.108) 0:00:17.905 ***** 13830 1727204084.82744: entering _queue_task() for managed-node3/debug 13830 1727204084.83314: worker is 1 (out of 1 available) 13830 1727204084.83326: exiting _queue_task() for managed-node3/debug 13830 1727204084.83341: done queuing things up, now waiting for results queue to drain 13830 1727204084.83343: waiting for pending results... 
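The debug result above ("Show stderr messages for the network_connections", tasks/main.yml:177) prints __network_connections_result.stderr_lines, the per-connection messages recorded when the profiles were added and brought up. A hedged sketch of a debug task of that shape (illustrative, inferred from the task name and the variable shown in the output, not copied from the role source):

    # Illustrative sketch only.
    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

The sibling task queued next ("Show debug messages for the network_connections", tasks/main.yml:181) prints the whole __network_connections_result fact, including the echoed module_args, as seen in the following output.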
13830 1727204084.83699: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13830 1727204084.83834: in run() - task 0affcd87-79f5-1659-6b02-00000000028a 13830 1727204084.83851: variable 'ansible_search_path' from source: unknown 13830 1727204084.83855: variable 'ansible_search_path' from source: unknown 13830 1727204084.83901: calling self._execute() 13830 1727204084.83992: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204084.83996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204084.84010: variable 'omit' from source: magic vars 13830 1727204084.84639: variable 'ansible_distribution_major_version' from source: facts 13830 1727204084.84650: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204084.84773: variable 'omit' from source: magic vars 13830 1727204084.84846: variable 'omit' from source: magic vars 13830 1727204084.84990: variable 'omit' from source: magic vars 13830 1727204084.85032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204084.85069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204084.85223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204084.85242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204084.85253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204084.85284: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204084.85287: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204084.85289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204084.85577: Set connection var ansible_connection to ssh 13830 1727204084.85587: Set connection var ansible_timeout to 10 13830 1727204084.85593: Set connection var ansible_shell_executable to /bin/sh 13830 1727204084.85595: Set connection var ansible_shell_type to sh 13830 1727204084.85601: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204084.85611: Set connection var ansible_pipelining to False 13830 1727204084.85711: variable 'ansible_shell_executable' from source: unknown 13830 1727204084.85714: variable 'ansible_connection' from source: unknown 13830 1727204084.85716: variable 'ansible_module_compression' from source: unknown 13830 1727204084.85719: variable 'ansible_shell_type' from source: unknown 13830 1727204084.85720: variable 'ansible_shell_executable' from source: unknown 13830 1727204084.85722: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204084.85726: variable 'ansible_pipelining' from source: unknown 13830 1727204084.85728: variable 'ansible_timeout' from source: unknown 13830 1727204084.85730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204084.85828: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 
1727204084.85843: variable 'omit' from source: magic vars 13830 1727204084.85854: starting attempt loop 13830 1727204084.85857: running the handler 13830 1727204084.85907: variable '__network_connections_result' from source: set_fact 13830 1727204084.86118: variable '__network_connections_result' from source: set_fact 13830 1727204084.86684: handler run complete 13830 1727204084.86722: attempt loop complete, returning result 13830 1727204084.86845: _execute() done 13830 1727204084.86848: dumping result to json 13830 1727204084.86853: done dumping result, returning 13830 1727204084.86867: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1659-6b02-00000000028a] 13830 1727204084.86870: sending task result for task 0affcd87-79f5-1659-6b02-00000000028a 13830 1727204084.87001: done sending task result for task 0affcd87-79f5-1659-6b02-00000000028a 13830 1727204084.87005: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 (not-active)" 
] } } 13830 1727204084.87146: no more pending results, returning what we have 13830 1727204084.87151: results queue empty 13830 1727204084.87152: checking for any_errors_fatal 13830 1727204084.87158: done checking for any_errors_fatal 13830 1727204084.87160: checking for max_fail_percentage 13830 1727204084.87161: done checking for max_fail_percentage 13830 1727204084.87162: checking to see if all hosts have failed and the running result is not ok 13830 1727204084.87163: done checking to see if all hosts have failed 13830 1727204084.87165: getting the remaining hosts for this loop 13830 1727204084.87167: done getting the remaining hosts for this loop 13830 1727204084.87172: getting the next task for host managed-node3 13830 1727204084.87179: done getting next task for host managed-node3 13830 1727204084.87183: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13830 1727204084.87189: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204084.87199: getting variables 13830 1727204084.87201: in VariableManager get_vars() 13830 1727204084.87240: Calling all_inventory to load vars for managed-node3 13830 1727204084.87244: Calling groups_inventory to load vars for managed-node3 13830 1727204084.87246: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204084.87258: Calling all_plugins_play to load vars for managed-node3 13830 1727204084.87261: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204084.87267: Calling groups_plugins_play to load vars for managed-node3 13830 1727204084.89633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204084.92285: done with get_vars() 13830 1727204084.92320: done getting variables 13830 1727204084.92386: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.096) 0:00:18.002 ***** 13830 1727204084.92422: entering _queue_task() for managed-node3/debug 13830 1727204084.93373: worker is 1 (out of 1 available) 13830 1727204084.93386: exiting _queue_task() for managed-node3/debug 13830 1727204084.93398: done queuing things up, now waiting for results queue to drain 13830 1727204084.93400: waiting for pending results... 13830 1727204084.94646: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13830 1727204084.94918: in run() - task 0affcd87-79f5-1659-6b02-00000000028b 13830 1727204084.95014: variable 'ansible_search_path' from source: unknown 13830 1727204084.95022: variable 'ansible_search_path' from source: unknown 13830 1727204084.95070: calling self._execute() 13830 1727204084.95187: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204084.95200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204084.95218: variable 'omit' from source: magic vars 13830 1727204084.95595: variable 'ansible_distribution_major_version' from source: facts 13830 1727204084.95614: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204084.95740: variable 'network_state' from source: role '' defaults 13830 1727204084.95755: Evaluated conditional (network_state != {}): False 13830 1727204084.95762: when evaluation is False, skipping this task 13830 1727204084.95772: _execute() done 13830 1727204084.95780: dumping result to json 13830 1727204084.95786: done dumping result, returning 13830 1727204084.95797: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1659-6b02-00000000028b] 13830 1727204084.95811: sending task result for task 0affcd87-79f5-1659-6b02-00000000028b skipping: [managed-node3] => { "false_condition": "network_state != {}" } 13830 1727204084.95957: no more pending results, returning what we have 13830 1727204084.95961: results queue empty 13830 1727204084.95962: checking for any_errors_fatal 13830 1727204084.95973: done checking 
for any_errors_fatal 13830 1727204084.95974: checking for max_fail_percentage 13830 1727204084.95976: done checking for max_fail_percentage 13830 1727204084.95977: checking to see if all hosts have failed and the running result is not ok 13830 1727204084.95978: done checking to see if all hosts have failed 13830 1727204084.95978: getting the remaining hosts for this loop 13830 1727204084.95980: done getting the remaining hosts for this loop 13830 1727204084.95985: getting the next task for host managed-node3 13830 1727204084.95993: done getting next task for host managed-node3 13830 1727204084.95997: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13830 1727204084.96004: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204084.96022: getting variables 13830 1727204084.96024: in VariableManager get_vars() 13830 1727204084.96061: Calling all_inventory to load vars for managed-node3 13830 1727204084.96066: Calling groups_inventory to load vars for managed-node3 13830 1727204084.96069: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204084.96080: Calling all_plugins_play to load vars for managed-node3 13830 1727204084.96082: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204084.96085: Calling groups_plugins_play to load vars for managed-node3 13830 1727204084.97112: done sending task result for task 0affcd87-79f5-1659-6b02-00000000028b 13830 1727204084.97115: WORKER PROCESS EXITING 13830 1727204084.98098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204085.00229: done with get_vars() 13830 1727204085.00261: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.079) 0:00:18.081 ***** 13830 1727204085.00374: entering _queue_task() for managed-node3/ping 13830 1727204085.00380: Creating lock for ping 13830 1727204085.00717: worker is 1 (out of 1 available) 13830 1727204085.00730: exiting _queue_task() for managed-node3/ping 13830 1727204085.00741: done queuing things up, now waiting for results queue to drain 13830 1727204085.00743: waiting for pending results... 
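The module_args echoed in the "Show debug messages for the network_connections" output above mirror the network_connections variable the calling play handed to the role. A hedged reconstruction of that variable in playbook YAML follows; the values are copied from the printed invocation, while the exact input key spellings and surrounding play structure are assumptions:

    # Hedged reconstruction of the role input; values taken from the module_args printed above.
    network_connections:
      - name: bond0
        state: up
        type: bond
        interface_name: nm-bond
        bond:
          mode: 802.3ad
          ad_actor_sys_prio: 65535
          ad_actor_system: "00:00:5e:00:53:5d"
          ad_select: stable
          ad_user_port_key: 1023
          all_ports_active: true
          downdelay: 0
          lacp_rate: slow
          lp_interval: 128
          miimon: 110
          min_links: 0
          num_grat_arp: 64
          primary_reselect: better
          resend_igmp: 225
          updelay: 0
          use_carrier: true
          xmit_hash_policy: encap2+3
        ip:
          route_metric4: 65535
      - name: bond0.0
        state: up
        type: ethernet
        interface_name: test1
        controller: bond0
      - name: bond0.1
        state: up
        type: ethernet
        interface_name: test2
        controller: bond0

With this input the role creates the bond controller plus two ethernet port profiles, which matches the stderr_lines above: three "add connection" entries followed by three "up connection" entries.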
13830 1727204085.01028: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13830 1727204085.01263: in run() - task 0affcd87-79f5-1659-6b02-00000000028c 13830 1727204085.01292: variable 'ansible_search_path' from source: unknown 13830 1727204085.01304: variable 'ansible_search_path' from source: unknown 13830 1727204085.01344: calling self._execute() 13830 1727204085.01443: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.01456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.01473: variable 'omit' from source: magic vars 13830 1727204085.01949: variable 'ansible_distribution_major_version' from source: facts 13830 1727204085.01973: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204085.01985: variable 'omit' from source: magic vars 13830 1727204085.02166: variable 'omit' from source: magic vars 13830 1727204085.02205: variable 'omit' from source: magic vars 13830 1727204085.02249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204085.02413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204085.02440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204085.02462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204085.02506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204085.02545: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204085.02596: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.02700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.02829: Set connection var ansible_connection to ssh 13830 1727204085.02887: Set connection var ansible_timeout to 10 13830 1727204085.02898: Set connection var ansible_shell_executable to /bin/sh 13830 1727204085.02905: Set connection var ansible_shell_type to sh 13830 1727204085.02920: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204085.02939: Set connection var ansible_pipelining to False 13830 1727204085.02968: variable 'ansible_shell_executable' from source: unknown 13830 1727204085.02977: variable 'ansible_connection' from source: unknown 13830 1727204085.02985: variable 'ansible_module_compression' from source: unknown 13830 1727204085.02991: variable 'ansible_shell_type' from source: unknown 13830 1727204085.02997: variable 'ansible_shell_executable' from source: unknown 13830 1727204085.03003: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.03010: variable 'ansible_pipelining' from source: unknown 13830 1727204085.03019: variable 'ansible_timeout' from source: unknown 13830 1727204085.03030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.03247: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204085.03268: variable 'omit' from source: magic vars 13830 
1727204085.03279: starting attempt loop 13830 1727204085.03286: running the handler 13830 1727204085.03305: _low_level_execute_command(): starting 13830 1727204085.03318: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204085.04083: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204085.04100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.04121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.04143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.04189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.04201: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204085.04214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.04238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204085.04251: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204085.04262: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204085.04277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.04290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.04306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.04320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.04332: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204085.04353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.04429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204085.04456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204085.04481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204085.04561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204085.06192: stdout chunk (state=3): >>>/root <<< 13830 1727204085.06369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204085.06398: stdout chunk (state=3): >>><<< 13830 1727204085.06401: stderr chunk (state=3): >>><<< 13830 1727204085.06516: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204085.06520: _low_level_execute_command(): starting 13830 1727204085.06522: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029 `" && echo ansible-tmp-1727204085.0642333-15134-1865938675029="` echo /root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029 `" ) && sleep 0' 13830 1727204085.07214: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204085.07231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.07246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.07262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.07313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.07329: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204085.07346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.07365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204085.07377: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204085.07391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204085.07403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.07416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.07438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.07450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.07463: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204085.07479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.07567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204085.07589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204085.07605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204085.07684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204085.09495: stdout chunk (state=3): >>>ansible-tmp-1727204085.0642333-15134-1865938675029=/root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029 <<< 13830 1727204085.09699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204085.09702: stdout chunk (state=3): >>><<< 13830 1727204085.09705: stderr chunk (state=3): >>><<< 13830 1727204085.10075: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204085.0642333-15134-1865938675029=/root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204085.10079: variable 'ansible_module_compression' from source: unknown 13830 1727204085.10082: ANSIBALLZ: Using lock for ping 13830 1727204085.10084: ANSIBALLZ: Acquiring lock 13830 1727204085.10086: ANSIBALLZ: Lock acquired: 140043691562608 13830 1727204085.10088: ANSIBALLZ: Creating module 13830 1727204085.23985: ANSIBALLZ: Writing module into payload 13830 1727204085.24062: ANSIBALLZ: Writing module 13830 1727204085.24098: ANSIBALLZ: Renaming module 13830 1727204085.24113: ANSIBALLZ: Done creating module 13830 1727204085.24133: variable 'ansible_facts' from source: unknown 13830 1727204085.24210: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029/AnsiballZ_ping.py 13830 1727204085.24384: Sending initial data 13830 1727204085.24387: Sent initial data (151 bytes) 13830 1727204085.25448: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204085.25466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.25484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.25506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.25553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.25568: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204085.25584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.25605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204085.25619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204085.25635: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204085.25648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.25662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.25681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 
1727204085.25695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.25707: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204085.25724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.25805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204085.25832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204085.25854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204085.25943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204085.27765: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204085.27788: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204085.27825: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp9253d0kn /root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029/AnsiballZ_ping.py <<< 13830 1727204085.27856: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204085.29133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204085.29260: stderr chunk (state=3): >>><<< 13830 1727204085.29263: stdout chunk (state=3): >>><<< 13830 1727204085.29268: done transferring module to remote 13830 1727204085.29270: _low_level_execute_command(): starting 13830 1727204085.29273: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029/ /root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029/AnsiballZ_ping.py && sleep 0' 13830 1727204085.29841: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204085.29856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.29875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.29897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.29941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.29951: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204085.29963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.29982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204085.29995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204085.30005: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 13830 1727204085.30015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.30026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.30042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.30052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.30061: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204085.30075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.30149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204085.30170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204085.30186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204085.30258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204085.32049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204085.32053: stdout chunk (state=3): >>><<< 13830 1727204085.32055: stderr chunk (state=3): >>><<< 13830 1727204085.32145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204085.32149: _low_level_execute_command(): starting 13830 1727204085.32152: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029/AnsiballZ_ping.py && sleep 0' 13830 1727204085.33454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204085.33685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.33700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.33716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.33761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.33776: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204085.33789: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.33805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204085.33816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204085.33825: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204085.33836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.33851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.33984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.33996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.34006: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204085.34020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.34098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204085.34114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204085.34128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204085.34480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204085.47102: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13830 1727204085.48188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204085.48274: stderr chunk (state=3): >>><<< 13830 1727204085.48278: stdout chunk (state=3): >>><<< 13830 1727204085.48400: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
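The round trip above (create a remote temp directory, sftp the AnsiballZ_ping.py payload, chmod it, execute it with /usr/bin/python3.9, then read back {"ping": "pong"}) occurs because ansible_pipelining is set to False for this connection, as logged when the connection variables were established. With pipelining enabled, Ansible sends the module over the already-open SSH channel instead of staging files on the target. A hedged sketch of the host/group variable that would turn it on, offered only as an illustration and not part of this run's inventory:

    # group_vars/all.yml -- illustrative only; this run logged "Set connection var ansible_pipelining to False".
    ansible_pipelining: true

Pipelining generally requires that requiretty be disabled in the target's sudoers configuration, which is presumably why it is left off by default in test environments like this one.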
13830 1727204085.48410: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204085.48412: _low_level_execute_command(): starting 13830 1727204085.48414: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204085.0642333-15134-1865938675029/ > /dev/null 2>&1 && sleep 0' 13830 1727204085.50318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204085.50423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.50438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.50455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.50539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.50598: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204085.50624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.50660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204085.50706: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204085.50717: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204085.50728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204085.50741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204085.50756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204085.50768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204085.50780: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204085.50792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204085.50880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204085.50979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204085.50997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204085.51093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204085.52968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204085.52972: stdout chunk (state=3): >>><<< 13830 1727204085.52976: stderr chunk (state=3): >>><<< 13830 1727204085.53020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204085.53024: handler run complete 13830 1727204085.53041: attempt loop complete, returning result 13830 1727204085.53044: _execute() done 13830 1727204085.53047: dumping result to json 13830 1727204085.53049: done dumping result, returning 13830 1727204085.53058: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1659-6b02-00000000028c] 13830 1727204085.53063: sending task result for task 0affcd87-79f5-1659-6b02-00000000028c 13830 1727204085.53157: done sending task result for task 0affcd87-79f5-1659-6b02-00000000028c 13830 1727204085.53159: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 13830 1727204085.53228: no more pending results, returning what we have 13830 1727204085.53235: results queue empty 13830 1727204085.53235: checking for any_errors_fatal 13830 1727204085.53241: done checking for any_errors_fatal 13830 1727204085.53242: checking for max_fail_percentage 13830 1727204085.53243: done checking for max_fail_percentage 13830 1727204085.53244: checking to see if all hosts have failed and the running result is not ok 13830 1727204085.53245: done checking to see if all hosts have failed 13830 1727204085.53245: getting the remaining hosts for this loop 13830 1727204085.53247: done getting the remaining hosts for this loop 13830 1727204085.53251: getting the next task for host managed-node3 13830 1727204085.53261: done getting next task for host managed-node3 13830 1727204085.53263: ^ task is: TASK: meta (role_complete) 13830 1727204085.53271: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204085.53282: getting variables 13830 1727204085.53284: in VariableManager get_vars() 13830 1727204085.53323: Calling all_inventory to load vars for managed-node3 13830 1727204085.53326: Calling groups_inventory to load vars for managed-node3 13830 1727204085.53329: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204085.53343: Calling all_plugins_play to load vars for managed-node3 13830 1727204085.53346: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204085.53350: Calling groups_plugins_play to load vars for managed-node3 13830 1727204085.55601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204085.57534: done with get_vars() 13830 1727204085.57578: done getting variables 13830 1727204085.57673: done queuing things up, now waiting for results queue to drain 13830 1727204085.57675: results queue empty 13830 1727204085.57676: checking for any_errors_fatal 13830 1727204085.57725: done checking for any_errors_fatal 13830 1727204085.57727: checking for max_fail_percentage 13830 1727204085.57728: done checking for max_fail_percentage 13830 1727204085.57729: checking to see if all hosts have failed and the running result is not ok 13830 1727204085.57732: done checking to see if all hosts have failed 13830 1727204085.57733: getting the remaining hosts for this loop 13830 1727204085.57734: done getting the remaining hosts for this loop 13830 1727204085.57738: getting the next task for host managed-node3 13830 1727204085.57742: done getting next task for host managed-node3 13830 1727204085.57744: ^ task is: TASK: Show result 13830 1727204085.57746: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204085.57749: getting variables 13830 1727204085.57750: in VariableManager get_vars() 13830 1727204085.57764: Calling all_inventory to load vars for managed-node3 13830 1727204085.57768: Calling groups_inventory to load vars for managed-node3 13830 1727204085.57826: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204085.57835: Calling all_plugins_play to load vars for managed-node3 13830 1727204085.57837: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204085.57840: Calling groups_plugins_play to load vars for managed-node3 13830 1727204085.60111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204085.62203: done with get_vars() 13830 1727204085.62227: done getting variables 13830 1727204085.62274: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:46 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.619) 0:00:18.701 ***** 13830 1727204085.62310: entering _queue_task() for managed-node3/debug 13830 1727204085.62672: worker is 1 (out of 1 available) 13830 1727204085.62686: exiting _queue_task() for managed-node3/debug 13830 1727204085.62696: done queuing things up, now waiting for results queue to drain 13830 1727204085.62698: waiting for pending results... 13830 1727204085.63004: running TaskExecutor() for managed-node3/TASK: Show result 13830 1727204085.63128: in run() - task 0affcd87-79f5-1659-6b02-0000000001c6 13830 1727204085.63158: variable 'ansible_search_path' from source: unknown 13830 1727204085.63169: variable 'ansible_search_path' from source: unknown 13830 1727204085.63216: calling self._execute() 13830 1727204085.63322: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.63338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.63356: variable 'omit' from source: magic vars 13830 1727204085.64888: variable 'ansible_distribution_major_version' from source: facts 13830 1727204085.64909: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204085.64943: variable 'omit' from source: magic vars 13830 1727204085.65055: variable 'omit' from source: magic vars 13830 1727204085.65099: variable 'omit' from source: magic vars 13830 1727204085.65272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204085.65317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204085.65349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204085.65375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204085.65392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204085.65436: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 
1727204085.65446: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.65454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.65567: Set connection var ansible_connection to ssh 13830 1727204085.65585: Set connection var ansible_timeout to 10 13830 1727204085.65596: Set connection var ansible_shell_executable to /bin/sh 13830 1727204085.65603: Set connection var ansible_shell_type to sh 13830 1727204085.65614: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204085.65629: Set connection var ansible_pipelining to False 13830 1727204085.65668: variable 'ansible_shell_executable' from source: unknown 13830 1727204085.65677: variable 'ansible_connection' from source: unknown 13830 1727204085.65684: variable 'ansible_module_compression' from source: unknown 13830 1727204085.65690: variable 'ansible_shell_type' from source: unknown 13830 1727204085.65696: variable 'ansible_shell_executable' from source: unknown 13830 1727204085.65703: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.65709: variable 'ansible_pipelining' from source: unknown 13830 1727204085.65716: variable 'ansible_timeout' from source: unknown 13830 1727204085.65723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.65880: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204085.65897: variable 'omit' from source: magic vars 13830 1727204085.65907: starting attempt loop 13830 1727204085.65913: running the handler 13830 1727204085.65973: variable '__network_connections_result' from source: set_fact 13830 1727204085.66062: variable '__network_connections_result' from source: set_fact 13830 1727204085.66312: handler run complete 13830 1727204085.66350: attempt loop complete, returning result 13830 1727204085.66356: _execute() done 13830 1727204085.66361: dumping result to json 13830 1727204085.66372: done dumping result, returning 13830 1727204085.66381: done running TaskExecutor() for managed-node3/TASK: Show result [0affcd87-79f5-1659-6b02-0000000001c6] 13830 1727204085.66389: sending task result for task 0affcd87-79f5-1659-6b02-0000000001c6 ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up 
persistent_state:present, 'bond0': add connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a426d8cc-5539-4594-a8cb-0bd7ae20a9f8 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 332b7a11-84b4-4fa6-9593-05efe3c41549 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1316a51f-04e4-4493-b9cd-1041de4c4b19 (not-active)" ] } } 13830 1727204085.66629: no more pending results, returning what we have 13830 1727204085.66635: results queue empty 13830 1727204085.66636: checking for any_errors_fatal 13830 1727204085.66638: done checking for any_errors_fatal 13830 1727204085.66639: checking for max_fail_percentage 13830 1727204085.66641: done checking for max_fail_percentage 13830 1727204085.66642: checking to see if all hosts have failed and the running result is not ok 13830 1727204085.66643: done checking to see if all hosts have failed 13830 1727204085.66643: getting the remaining hosts for this loop 13830 1727204085.66645: done getting the remaining hosts for this loop 13830 1727204085.66650: getting the next task for host managed-node3 13830 1727204085.66659: done getting next task for host managed-node3 13830 1727204085.66662: ^ task is: TASK: Asserts 13830 1727204085.66671: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204085.66677: getting variables 13830 1727204085.66678: in VariableManager get_vars() 13830 1727204085.66711: Calling all_inventory to load vars for managed-node3 13830 1727204085.66714: Calling groups_inventory to load vars for managed-node3 13830 1727204085.66718: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204085.66732: Calling all_plugins_play to load vars for managed-node3 13830 1727204085.66735: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204085.66738: Calling groups_plugins_play to load vars for managed-node3 13830 1727204085.67733: done sending task result for task 0affcd87-79f5-1659-6b02-0000000001c6 13830 1727204085.67737: WORKER PROCESS EXITING 13830 1727204085.68668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204085.70468: done with get_vars() 13830 1727204085.70502: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.083) 0:00:18.784 ***** 13830 1727204085.70615: entering _queue_task() for managed-node3/include_tasks 13830 1727204085.71041: worker is 1 (out of 1 available) 13830 1727204085.71056: exiting _queue_task() for managed-node3/include_tasks 13830 1727204085.71071: done queuing things up, now waiting for results queue to drain 13830 1727204085.71073: waiting for pending results... 13830 1727204085.71393: running TaskExecutor() for managed-node3/TASK: Asserts 13830 1727204085.71528: in run() - task 0affcd87-79f5-1659-6b02-00000000008d 13830 1727204085.71559: variable 'ansible_search_path' from source: unknown 13830 1727204085.71570: variable 'ansible_search_path' from source: unknown 13830 1727204085.71623: variable 'lsr_assert' from source: include params 13830 1727204085.71872: variable 'lsr_assert' from source: include params 13830 1727204085.71960: variable 'omit' from source: magic vars 13830 1727204085.72110: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.72125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.72146: variable 'omit' from source: magic vars 13830 1727204085.72405: variable 'ansible_distribution_major_version' from source: facts 13830 1727204085.72427: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204085.72440: variable 'item' from source: unknown 13830 1727204085.72516: variable 'item' from source: unknown 13830 1727204085.72571: variable 'item' from source: unknown 13830 1727204085.72645: variable 'item' from source: unknown 13830 1727204085.72916: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.72945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.72960: variable 'omit' from source: magic vars 13830 1727204085.73221: variable 'ansible_distribution_major_version' from source: facts 13830 1727204085.73235: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204085.73245: variable 'item' from source: unknown 13830 1727204085.73326: variable 'item' from source: unknown 13830 1727204085.73367: variable 'item' from source: unknown 13830 1727204085.73438: variable 'item' from source: unknown 13830 1727204085.73574: variable 'ansible_host' from source: host vars 
for 'managed-node3' 13830 1727204085.73587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.73602: variable 'omit' from source: magic vars 13830 1727204085.73775: variable 'ansible_distribution_major_version' from source: facts 13830 1727204085.73786: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204085.73794: variable 'item' from source: unknown 13830 1727204085.73868: variable 'item' from source: unknown 13830 1727204085.73902: variable 'item' from source: unknown 13830 1727204085.73976: variable 'item' from source: unknown 13830 1727204085.74067: dumping result to json 13830 1727204085.74078: done dumping result, returning 13830 1727204085.74089: done running TaskExecutor() for managed-node3/TASK: Asserts [0affcd87-79f5-1659-6b02-00000000008d] 13830 1727204085.74100: sending task result for task 0affcd87-79f5-1659-6b02-00000000008d 13830 1727204085.74195: no more pending results, returning what we have 13830 1727204085.74202: in VariableManager get_vars() 13830 1727204085.74246: Calling all_inventory to load vars for managed-node3 13830 1727204085.74249: Calling groups_inventory to load vars for managed-node3 13830 1727204085.74253: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204085.74269: Calling all_plugins_play to load vars for managed-node3 13830 1727204085.74273: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204085.74276: Calling groups_plugins_play to load vars for managed-node3 13830 1727204085.75323: done sending task result for task 0affcd87-79f5-1659-6b02-00000000008d 13830 1727204085.75327: WORKER PROCESS EXITING 13830 1727204085.76239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204085.77998: done with get_vars() 13830 1727204085.78027: variable 'ansible_search_path' from source: unknown 13830 1727204085.78029: variable 'ansible_search_path' from source: unknown 13830 1727204085.78083: variable 'ansible_search_path' from source: unknown 13830 1727204085.78084: variable 'ansible_search_path' from source: unknown 13830 1727204085.78117: variable 'ansible_search_path' from source: unknown 13830 1727204085.78119: variable 'ansible_search_path' from source: unknown 13830 1727204085.78151: we have included files to process 13830 1727204085.78152: generating all_blocks data 13830 1727204085.78154: done generating all_blocks data 13830 1727204085.78159: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 13830 1727204085.78160: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 13830 1727204085.78165: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 13830 1727204085.78339: in VariableManager get_vars() 13830 1727204085.78361: done with get_vars() 13830 1727204085.78371: variable 'item' from source: include params 13830 1727204085.78490: variable 'item' from source: include params 13830 1727204085.78525: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13830 1727204085.78621: in VariableManager get_vars() 13830 1727204085.78644: done with get_vars() 13830 1727204085.78784: done processing included file 13830 1727204085.78787: iterating over new_blocks loaded from include file 13830 1727204085.78788: in VariableManager get_vars() 13830 1727204085.78804: done with get_vars() 13830 1727204085.78806: filtering new block on tags 13830 1727204085.78869: done filtering new block on tags 13830 1727204085.78872: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml for managed-node3 => (item=tasks/assert_controller_device_present.yml) 13830 1727204085.78878: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 13830 1727204085.78879: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 13830 1727204085.78883: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 13830 1727204085.79026: in VariableManager get_vars() 13830 1727204085.79052: done with get_vars() 13830 1727204085.79068: done processing included file 13830 1727204085.79070: iterating over new_blocks loaded from include file 13830 1727204085.79071: in VariableManager get_vars() 13830 1727204085.79084: done with get_vars() 13830 1727204085.79086: filtering new block on tags 13830 1727204085.79109: done filtering new block on tags 13830 1727204085.79111: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml for managed-node3 => (item=tasks/assert_bond_port_profile_present.yml) 13830 1727204085.79115: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 13830 1727204085.79116: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 13830 1727204085.79125: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 13830 1727204085.79495: in VariableManager get_vars() 13830 1727204085.79513: done with get_vars() 13830 1727204085.79557: in VariableManager get_vars() 13830 1727204085.79576: done with get_vars() 13830 1727204085.79593: done processing included file 13830 1727204085.79595: iterating over new_blocks loaded from include file 13830 1727204085.79596: in VariableManager get_vars() 13830 1727204085.79610: done with get_vars() 13830 1727204085.79611: filtering new block on tags 13830 1727204085.79656: done filtering new block on tags 13830 1727204085.79658: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed-node3 => (item=tasks/assert_bond_options.yml) 13830 1727204085.79662: extending task lists for all hosts with included blocks 13830 1727204085.81354: done extending 
task lists 13830 1727204085.81355: done processing included files 13830 1727204085.81356: results queue empty 13830 1727204085.81357: checking for any_errors_fatal 13830 1727204085.81363: done checking for any_errors_fatal 13830 1727204085.81367: checking for max_fail_percentage 13830 1727204085.81368: done checking for max_fail_percentage 13830 1727204085.81369: checking to see if all hosts have failed and the running result is not ok 13830 1727204085.81370: done checking to see if all hosts have failed 13830 1727204085.81370: getting the remaining hosts for this loop 13830 1727204085.81372: done getting the remaining hosts for this loop 13830 1727204085.81375: getting the next task for host managed-node3 13830 1727204085.81380: done getting next task for host managed-node3 13830 1727204085.81383: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13830 1727204085.81387: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204085.81389: getting variables 13830 1727204085.81390: in VariableManager get_vars() 13830 1727204085.81401: Calling all_inventory to load vars for managed-node3 13830 1727204085.81404: Calling groups_inventory to load vars for managed-node3 13830 1727204085.81406: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204085.81413: Calling all_plugins_play to load vars for managed-node3 13830 1727204085.81415: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204085.81419: Calling groups_plugins_play to load vars for managed-node3 13830 1727204085.82754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204085.84600: done with get_vars() 13830 1727204085.84623: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.141) 0:00:18.925 ***** 13830 1727204085.84723: entering _queue_task() for managed-node3/include_tasks 13830 1727204085.85099: worker is 1 (out of 1 available) 13830 1727204085.85114: exiting _queue_task() for managed-node3/include_tasks 13830 1727204085.85127: done queuing things up, now waiting for results queue to drain 13830 1727204085.85129: waiting for pending results... 
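For orientation, the "Asserts" task queued above is an include_tasks loop: each entry in lsr_assert names a task file under tasks/, and the log shows three such files being loaded and queued for managed-node3. A minimal sketch of that pattern follows; it is not the verbatim run_test.yml (whose wording is not visible in this log), and the loop keyword is assumed, since the trace only shows the per-item 'item' variable being set and lsr_assert arriving as an include parameter.

- name: Asserts
  include_tasks: "{{ item }}"
  loop: "{{ lsr_assert }}"
  # lsr_assert is supplied as an include parameter; per this log it expands to:
  #   - tasks/assert_controller_device_present.yml
  #   - tasks/assert_bond_port_profile_present.yml
  #   - tasks/assert_bond_options.yml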
13830 1727204085.85426: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 13830 1727204085.85604: in run() - task 0affcd87-79f5-1659-6b02-0000000003f5 13830 1727204085.85626: variable 'ansible_search_path' from source: unknown 13830 1727204085.85639: variable 'ansible_search_path' from source: unknown 13830 1727204085.85692: calling self._execute() 13830 1727204085.85793: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204085.85805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204085.85821: variable 'omit' from source: magic vars 13830 1727204085.86141: variable 'ansible_distribution_major_version' from source: facts 13830 1727204085.86157: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204085.86160: _execute() done 13830 1727204085.86163: dumping result to json 13830 1727204085.86170: done dumping result, returning 13830 1727204085.86175: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1659-6b02-0000000003f5] 13830 1727204085.86181: sending task result for task 0affcd87-79f5-1659-6b02-0000000003f5 13830 1727204085.86270: done sending task result for task 0affcd87-79f5-1659-6b02-0000000003f5 13830 1727204085.86274: WORKER PROCESS EXITING 13830 1727204085.86301: no more pending results, returning what we have 13830 1727204085.86306: in VariableManager get_vars() 13830 1727204085.86344: Calling all_inventory to load vars for managed-node3 13830 1727204085.86347: Calling groups_inventory to load vars for managed-node3 13830 1727204085.86350: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204085.86365: Calling all_plugins_play to load vars for managed-node3 13830 1727204085.86368: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204085.86371: Calling groups_plugins_play to load vars for managed-node3 13830 1727204085.87193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204085.88443: done with get_vars() 13830 1727204085.88473: variable 'ansible_search_path' from source: unknown 13830 1727204085.88476: variable 'ansible_search_path' from source: unknown 13830 1727204085.88515: we have included files to process 13830 1727204085.88517: generating all_blocks data 13830 1727204085.88519: done generating all_blocks data 13830 1727204085.88520: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204085.88521: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204085.88523: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204085.88780: done processing included file 13830 1727204085.88782: iterating over new_blocks loaded from include file 13830 1727204085.88784: in VariableManager get_vars() 13830 1727204085.88804: done with get_vars() 13830 1727204085.88806: filtering new block on tags 13830 1727204085.88844: done filtering new block on tags 13830 1727204085.88847: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 13830 
1727204085.88853: extending task lists for all hosts with included blocks 13830 1727204085.89083: done extending task lists 13830 1727204085.89084: done processing included files 13830 1727204085.89085: results queue empty 13830 1727204085.89086: checking for any_errors_fatal 13830 1727204085.89090: done checking for any_errors_fatal 13830 1727204085.89125: checking for max_fail_percentage 13830 1727204085.89128: done checking for max_fail_percentage 13830 1727204085.89129: checking to see if all hosts have failed and the running result is not ok 13830 1727204085.89130: done checking to see if all hosts have failed 13830 1727204085.89131: getting the remaining hosts for this loop 13830 1727204085.89132: done getting the remaining hosts for this loop 13830 1727204085.89141: getting the next task for host managed-node3 13830 1727204085.89146: done getting next task for host managed-node3 13830 1727204085.89148: ^ task is: TASK: Get stat for interface {{ interface }} 13830 1727204085.89152: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204085.89155: getting variables 13830 1727204085.89156: in VariableManager get_vars() 13830 1727204085.89170: Calling all_inventory to load vars for managed-node3 13830 1727204085.89173: Calling groups_inventory to load vars for managed-node3 13830 1727204085.89175: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204085.89184: Calling all_plugins_play to load vars for managed-node3 13830 1727204085.89226: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204085.89233: Calling groups_plugins_play to load vars for managed-node3 13830 1727204085.90925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204086.01677: done with get_vars() 13830 1727204086.01709: done getting variables 13830 1727204086.01873: variable 'interface' from source: task vars 13830 1727204086.01881: variable 'controller_device' from source: play vars 13830 1727204086.01944: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.172) 0:00:19.097 ***** 13830 1727204086.01979: entering _queue_task() for managed-node3/stat 13830 1727204086.02324: worker is 1 (out of 1 available) 13830 1727204086.02337: exiting _queue_task() for managed-node3/stat 13830 1727204086.02349: done queuing things up, now waiting for results queue to drain 13830 1727204086.02350: waiting for pending results... 13830 1727204086.02651: running TaskExecutor() for managed-node3/TASK: Get stat for interface nm-bond 13830 1727204086.02844: in run() - task 0affcd87-79f5-1659-6b02-0000000004af 13830 1727204086.02863: variable 'ansible_search_path' from source: unknown 13830 1727204086.02876: variable 'ansible_search_path' from source: unknown 13830 1727204086.02917: calling self._execute() 13830 1727204086.03017: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.03029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.03042: variable 'omit' from source: magic vars 13830 1727204086.03576: variable 'ansible_distribution_major_version' from source: facts 13830 1727204086.03597: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204086.03611: variable 'omit' from source: magic vars 13830 1727204086.03706: variable 'omit' from source: magic vars 13830 1727204086.03866: variable 'interface' from source: task vars 13830 1727204086.03877: variable 'controller_device' from source: play vars 13830 1727204086.03947: variable 'controller_device' from source: play vars 13830 1727204086.04005: variable 'omit' from source: magic vars 13830 1727204086.04062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204086.04112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204086.04147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204086.04173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204086.04193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13830 1727204086.04232: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204086.04240: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.04247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.04353: Set connection var ansible_connection to ssh 13830 1727204086.04371: Set connection var ansible_timeout to 10 13830 1727204086.04380: Set connection var ansible_shell_executable to /bin/sh 13830 1727204086.04386: Set connection var ansible_shell_type to sh 13830 1727204086.04399: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204086.04415: Set connection var ansible_pipelining to False 13830 1727204086.04446: variable 'ansible_shell_executable' from source: unknown 13830 1727204086.04453: variable 'ansible_connection' from source: unknown 13830 1727204086.04460: variable 'ansible_module_compression' from source: unknown 13830 1727204086.04468: variable 'ansible_shell_type' from source: unknown 13830 1727204086.04475: variable 'ansible_shell_executable' from source: unknown 13830 1727204086.04480: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.04487: variable 'ansible_pipelining' from source: unknown 13830 1727204086.04494: variable 'ansible_timeout' from source: unknown 13830 1727204086.04502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.04742: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204086.04761: variable 'omit' from source: magic vars 13830 1727204086.04773: starting attempt loop 13830 1727204086.04779: running the handler 13830 1727204086.04795: _low_level_execute_command(): starting 13830 1727204086.04807: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204086.05609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204086.05629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.05645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.05669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.05760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.05774: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204086.05787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.05804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204086.05815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204086.05826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204086.05867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.05895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.05911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 
1727204086.05925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.05936: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204086.05957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.06038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204086.06066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.06088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.06167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.07809: stdout chunk (state=3): >>>/root <<< 13830 1727204086.07984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204086.08018: stderr chunk (state=3): >>><<< 13830 1727204086.08021: stdout chunk (state=3): >>><<< 13830 1727204086.08135: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204086.08139: _low_level_execute_command(): starting 13830 1727204086.08141: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619 `" && echo ansible-tmp-1727204086.0804188-15179-129780431840619="` echo /root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619 `" ) && sleep 0' 13830 1727204086.08740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204086.08749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.08761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.08777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.08817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.08823: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204086.08835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.08848: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 13830 1727204086.08856: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204086.08862: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204086.08874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.08884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.08895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.08903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.08908: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204086.08917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.08994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204086.09009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.09018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.09094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.10976: stdout chunk (state=3): >>>ansible-tmp-1727204086.0804188-15179-129780431840619=/root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619 <<< 13830 1727204086.11088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204086.11204: stderr chunk (state=3): >>><<< 13830 1727204086.11207: stdout chunk (state=3): >>><<< 13830 1727204086.11234: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204086.0804188-15179-129780431840619=/root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204086.11294: variable 'ansible_module_compression' from source: unknown 13830 1727204086.11359: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13830 1727204086.11402: variable 'ansible_facts' from source: unknown 13830 1727204086.11472: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619/AnsiballZ_stat.py 13830 1727204086.11623: Sending initial data 13830 1727204086.11626: 
Sent initial data (153 bytes) 13830 1727204086.12672: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204086.12675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.12678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.12680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.12683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.12686: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204086.12802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.12805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204086.12808: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204086.12811: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204086.12813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.12815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.12817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.12819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.12821: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204086.12823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.12839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204086.12853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.12865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.13054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.14641: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204086.14674: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204086.14714: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp6fwpadwx /root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619/AnsiballZ_stat.py <<< 13830 1727204086.14745: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204086.15849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204086.15948: stderr chunk (state=3): >>><<< 13830 1727204086.15951: stdout chunk 
(state=3): >>><<< 13830 1727204086.15980: done transferring module to remote 13830 1727204086.15990: _low_level_execute_command(): starting 13830 1727204086.15997: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619/ /root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619/AnsiballZ_stat.py && sleep 0' 13830 1727204086.16779: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204086.16788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.16800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.16817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.16871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.16886: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204086.16896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.16910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204086.16918: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204086.16925: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204086.16935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.16942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.16960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.16970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.16977: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204086.16986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.17059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204086.17086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.17105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.17567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.19008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204086.19012: stdout chunk (state=3): >>><<< 13830 1727204086.19014: stderr chunk (state=3): >>><<< 13830 1727204086.19122: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204086.19127: _low_level_execute_command(): starting 13830 1727204086.19130: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619/AnsiballZ_stat.py && sleep 0' 13830 1727204086.19750: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204086.19766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.19783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.19808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.19851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.19865: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204086.19881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.19905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204086.19917: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204086.19929: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204086.19942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.19958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.19977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.19990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.20001: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204086.20023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.20095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204086.20117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.20136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.20212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.33403: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27547, "dev": 21, "nlink": 1, "atime": 1727204084.2969408, "mtime": 1727204084.2969408, "ctime": 1727204084.2969408, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, 
"writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13830 1727204086.34475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204086.34480: stdout chunk (state=3): >>><<< 13830 1727204086.34482: stderr chunk (state=3): >>><<< 13830 1727204086.34658: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27547, "dev": 21, "nlink": 1, "atime": 1727204084.2969408, "mtime": 1727204084.2969408, "ctime": 1727204084.2969408, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204086.34672: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204086.34676: _low_level_execute_command(): starting 13830 1727204086.34678: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204086.0804188-15179-129780431840619/ > /dev/null 2>&1 && sleep 0' 13830 1727204086.35535: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.35540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.35566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13830 1727204086.35587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.35679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.35682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.35754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.37600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204086.37604: stdout chunk (state=3): >>><<< 13830 1727204086.37606: stderr chunk (state=3): >>><<< 13830 1727204086.37977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204086.37981: handler run complete 13830 1727204086.37984: attempt loop complete, returning result 13830 1727204086.37987: _execute() done 13830 1727204086.37989: dumping result to json 13830 1727204086.37991: done dumping result, returning 13830 1727204086.37993: done running TaskExecutor() for managed-node3/TASK: Get stat for interface nm-bond [0affcd87-79f5-1659-6b02-0000000004af] 13830 1727204086.37995: sending task result for task 0affcd87-79f5-1659-6b02-0000000004af ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204084.2969408, "block_size": 4096, "blocks": 0, "ctime": 1727204084.2969408, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27547, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727204084.2969408, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13830 1727204086.38169: no more pending results, returning what we have 13830 1727204086.38174: results queue empty 13830 1727204086.38175: checking for any_errors_fatal 13830 1727204086.38177: done checking for any_errors_fatal 13830 1727204086.38177: checking for max_fail_percentage 13830 1727204086.38179: done checking for max_fail_percentage 13830 1727204086.38180: checking to see if all hosts have failed and the running result is not ok 13830 1727204086.38181: done checking to see if all hosts have failed 13830 1727204086.38181: getting the remaining hosts for this loop 13830 1727204086.38183: done getting the remaining hosts for this loop 13830 1727204086.38187: getting the next task for host managed-node3 13830 1727204086.38196: done getting next task for host managed-node3 13830 1727204086.38199: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13830 1727204086.38205: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204086.38209: getting variables 13830 1727204086.38210: in VariableManager get_vars() 13830 1727204086.38245: Calling all_inventory to load vars for managed-node3 13830 1727204086.38248: Calling groups_inventory to load vars for managed-node3 13830 1727204086.38253: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204086.38266: Calling all_plugins_play to load vars for managed-node3 13830 1727204086.38269: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204086.38279: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004af 13830 1727204086.38283: WORKER PROCESS EXITING 13830 1727204086.38614: Calling groups_plugins_play to load vars for managed-node3 13830 1727204086.42081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204086.45207: done with get_vars() 13830 1727204086.45239: done getting variables 13830 1727204086.45300: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204086.45424: variable 'interface' from source: task vars 13830 1727204086.45428: variable 'controller_device' from source: play vars 13830 1727204086.45489: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.435) 0:00:19.533 ***** 13830 1727204086.45524: entering _queue_task() for managed-node3/assert 13830 1727204086.46159: worker is 1 (out of 1 available) 13830 1727204086.46174: exiting _queue_task() for managed-node3/assert 13830 1727204086.46185: done queuing things up, now waiting for results queue to drain 13830 1727204086.46186: waiting for pending results... 
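The assert queued here (task path assert_device_present.yml:5) consumes the stat result gathered above; a few entries below it evaluates interface_stat.stat.exists to True and reports "All assertions passed". A hedged sketch of what that task plausibly looks like; the msg wording is an assumption, since the log only shows the condition that was evaluated:

    # Sketch only: the condition matches the log, the message text is assumed.
    - name: "Assert that the interface is present - '{{ interface }}'"
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists
        msg: "Interface {{ interface }} is not present"

As the variable resolution above shows, interface comes from task vars and expands to controller_device, which is nm-bond in this run.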
13830 1727204086.46483: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'nm-bond' 13830 1727204086.46632: in run() - task 0affcd87-79f5-1659-6b02-0000000003f6 13830 1727204086.46652: variable 'ansible_search_path' from source: unknown 13830 1727204086.46657: variable 'ansible_search_path' from source: unknown 13830 1727204086.46693: calling self._execute() 13830 1727204086.46786: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.46790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.46801: variable 'omit' from source: magic vars 13830 1727204086.47206: variable 'ansible_distribution_major_version' from source: facts 13830 1727204086.47219: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204086.47224: variable 'omit' from source: magic vars 13830 1727204086.47298: variable 'omit' from source: magic vars 13830 1727204086.47398: variable 'interface' from source: task vars 13830 1727204086.47408: variable 'controller_device' from source: play vars 13830 1727204086.47468: variable 'controller_device' from source: play vars 13830 1727204086.47489: variable 'omit' from source: magic vars 13830 1727204086.47543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204086.47578: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204086.47600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204086.47624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204086.47637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204086.47665: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204086.47669: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.47671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.47770: Set connection var ansible_connection to ssh 13830 1727204086.47780: Set connection var ansible_timeout to 10 13830 1727204086.47786: Set connection var ansible_shell_executable to /bin/sh 13830 1727204086.47789: Set connection var ansible_shell_type to sh 13830 1727204086.47794: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204086.47803: Set connection var ansible_pipelining to False 13830 1727204086.47839: variable 'ansible_shell_executable' from source: unknown 13830 1727204086.47842: variable 'ansible_connection' from source: unknown 13830 1727204086.47845: variable 'ansible_module_compression' from source: unknown 13830 1727204086.47847: variable 'ansible_shell_type' from source: unknown 13830 1727204086.47849: variable 'ansible_shell_executable' from source: unknown 13830 1727204086.47851: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.47855: variable 'ansible_pipelining' from source: unknown 13830 1727204086.47858: variable 'ansible_timeout' from source: unknown 13830 1727204086.47862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.48017: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204086.48028: variable 'omit' from source: magic vars 13830 1727204086.48035: starting attempt loop 13830 1727204086.48042: running the handler 13830 1727204086.48182: variable 'interface_stat' from source: set_fact 13830 1727204086.48202: Evaluated conditional (interface_stat.stat.exists): True 13830 1727204086.48207: handler run complete 13830 1727204086.48222: attempt loop complete, returning result 13830 1727204086.48225: _execute() done 13830 1727204086.48228: dumping result to json 13830 1727204086.48233: done dumping result, returning 13830 1727204086.48236: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'nm-bond' [0affcd87-79f5-1659-6b02-0000000003f6] 13830 1727204086.48242: sending task result for task 0affcd87-79f5-1659-6b02-0000000003f6 13830 1727204086.48346: done sending task result for task 0affcd87-79f5-1659-6b02-0000000003f6 13830 1727204086.48348: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204086.48427: no more pending results, returning what we have 13830 1727204086.48431: results queue empty 13830 1727204086.48432: checking for any_errors_fatal 13830 1727204086.48441: done checking for any_errors_fatal 13830 1727204086.48442: checking for max_fail_percentage 13830 1727204086.48445: done checking for max_fail_percentage 13830 1727204086.48446: checking to see if all hosts have failed and the running result is not ok 13830 1727204086.48447: done checking to see if all hosts have failed 13830 1727204086.48447: getting the remaining hosts for this loop 13830 1727204086.48450: done getting the remaining hosts for this loop 13830 1727204086.48455: getting the next task for host managed-node3 13830 1727204086.48469: done getting next task for host managed-node3 13830 1727204086.48472: ^ task is: TASK: Include the task 'assert_profile_present.yml' 13830 1727204086.48478: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204086.48483: getting variables 13830 1727204086.48485: in VariableManager get_vars() 13830 1727204086.48521: Calling all_inventory to load vars for managed-node3 13830 1727204086.48524: Calling groups_inventory to load vars for managed-node3 13830 1727204086.48529: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204086.48541: Calling all_plugins_play to load vars for managed-node3 13830 1727204086.48544: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204086.48547: Calling groups_plugins_play to load vars for managed-node3 13830 1727204086.51428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204086.54506: done with get_vars() 13830 1727204086.54536: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml:3 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.091) 0:00:19.624 ***** 13830 1727204086.54638: entering _queue_task() for managed-node3/include_tasks 13830 1727204086.54986: worker is 1 (out of 1 available) 13830 1727204086.54999: exiting _queue_task() for managed-node3/include_tasks 13830 1727204086.55010: done queuing things up, now waiting for results queue to drain 13830 1727204086.55012: waiting for pending results... 13830 1727204086.55306: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' 13830 1727204086.55437: in run() - task 0affcd87-79f5-1659-6b02-0000000003fb 13830 1727204086.55447: variable 'ansible_search_path' from source: unknown 13830 1727204086.55449: variable 'ansible_search_path' from source: unknown 13830 1727204086.55559: variable 'controller_profile' from source: play vars 13830 1727204086.56432: variable 'controller_profile' from source: play vars 13830 1727204086.56443: variable 'port1_profile' from source: play vars 13830 1727204086.56518: variable 'port1_profile' from source: play vars 13830 1727204086.56524: variable 'port2_profile' from source: play vars 13830 1727204086.56591: variable 'port2_profile' from source: play vars 13830 1727204086.56603: variable 'omit' from source: magic vars 13830 1727204086.56747: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.56756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.56769: variable 'omit' from source: magic vars 13830 1727204086.57010: variable 'ansible_distribution_major_version' from source: facts 13830 1727204086.57020: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204086.57059: variable 'bond_port_profile' from source: unknown 13830 1727204086.57122: variable 'bond_port_profile' from source: unknown 13830 1727204086.57265: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.57269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.57273: variable 'omit' from source: magic vars 13830 1727204086.57392: variable 'ansible_distribution_major_version' from source: facts 13830 1727204086.57398: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204086.57432: variable 'bond_port_profile' from source: unknown 13830 1727204086.57490: variable 'bond_port_profile' from source: unknown 13830 1727204086.57570: variable 'ansible_host' 
from source: host vars for 'managed-node3' 13830 1727204086.57573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.57576: variable 'omit' from source: magic vars 13830 1727204086.57729: variable 'ansible_distribution_major_version' from source: facts 13830 1727204086.57735: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204086.57766: variable 'bond_port_profile' from source: unknown 13830 1727204086.57829: variable 'bond_port_profile' from source: unknown 13830 1727204086.57898: dumping result to json 13830 1727204086.57901: done dumping result, returning 13830 1727204086.57904: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' [0affcd87-79f5-1659-6b02-0000000003fb] 13830 1727204086.57906: sending task result for task 0affcd87-79f5-1659-6b02-0000000003fb 13830 1727204086.57947: done sending task result for task 0affcd87-79f5-1659-6b02-0000000003fb 13830 1727204086.57951: WORKER PROCESS EXITING 13830 1727204086.57984: no more pending results, returning what we have 13830 1727204086.57989: in VariableManager get_vars() 13830 1727204086.58030: Calling all_inventory to load vars for managed-node3 13830 1727204086.58033: Calling groups_inventory to load vars for managed-node3 13830 1727204086.58037: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204086.58051: Calling all_plugins_play to load vars for managed-node3 13830 1727204086.58054: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204086.58057: Calling groups_plugins_play to load vars for managed-node3 13830 1727204086.59645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204086.61417: done with get_vars() 13830 1727204086.61436: variable 'ansible_search_path' from source: unknown 13830 1727204086.61438: variable 'ansible_search_path' from source: unknown 13830 1727204086.61447: variable 'item' from source: include params 13830 1727204086.61854: variable 'item' from source: include params 13830 1727204086.61892: variable 'ansible_search_path' from source: unknown 13830 1727204086.61893: variable 'ansible_search_path' from source: unknown 13830 1727204086.61899: variable 'item' from source: include params 13830 1727204086.61956: variable 'item' from source: include params 13830 1727204086.61986: variable 'ansible_search_path' from source: unknown 13830 1727204086.61987: variable 'ansible_search_path' from source: unknown 13830 1727204086.61992: variable 'item' from source: include params 13830 1727204086.62044: variable 'item' from source: include params 13830 1727204086.62075: we have included files to process 13830 1727204086.62077: generating all_blocks data 13830 1727204086.62079: done generating all_blocks data 13830 1727204086.62083: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.62084: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.62086: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.62285: in VariableManager get_vars() 13830 1727204086.62308: done with get_vars() 13830 1727204086.63316: done processing included file 13830 1727204086.63318: 
iterating over new_blocks loaded from include file 13830 1727204086.63320: in VariableManager get_vars() 13830 1727204086.63337: done with get_vars() 13830 1727204086.63339: filtering new block on tags 13830 1727204086.63403: done filtering new block on tags 13830 1727204086.63406: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0) 13830 1727204086.63411: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.63412: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.63415: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.63523: in VariableManager get_vars() 13830 1727204086.63544: done with get_vars() 13830 1727204086.64379: done processing included file 13830 1727204086.64381: iterating over new_blocks loaded from include file 13830 1727204086.64382: in VariableManager get_vars() 13830 1727204086.64396: done with get_vars() 13830 1727204086.64397: filtering new block on tags 13830 1727204086.64452: done filtering new block on tags 13830 1727204086.64455: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0.0) 13830 1727204086.64460: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.64461: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.64467: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13830 1727204086.64576: in VariableManager get_vars() 13830 1727204086.64593: done with get_vars() 13830 1727204086.64807: done processing included file 13830 1727204086.64810: iterating over new_blocks loaded from include file 13830 1727204086.64811: in VariableManager get_vars() 13830 1727204086.64824: done with get_vars() 13830 1727204086.64825: filtering new block on tags 13830 1727204086.64876: done filtering new block on tags 13830 1727204086.64879: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0.1) 13830 1727204086.64883: extending task lists for all hosts with included blocks 13830 1727204086.65658: done extending task lists 13830 1727204086.65659: done processing included files 13830 1727204086.65660: results queue empty 13830 1727204086.65661: checking for any_errors_fatal 13830 1727204086.65668: done checking for any_errors_fatal 13830 1727204086.65668: checking for max_fail_percentage 13830 1727204086.65670: done checking for max_fail_percentage 13830 1727204086.65670: checking to see if all hosts have failed and the running result is not ok 13830 1727204086.65671: done checking to see if all hosts have failed 13830 1727204086.65672: 
getting the remaining hosts for this loop 13830 1727204086.65673: done getting the remaining hosts for this loop 13830 1727204086.65676: getting the next task for host managed-node3 13830 1727204086.65681: done getting next task for host managed-node3 13830 1727204086.65683: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13830 1727204086.65686: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204086.65688: getting variables 13830 1727204086.65689: in VariableManager get_vars() 13830 1727204086.65699: Calling all_inventory to load vars for managed-node3 13830 1727204086.65701: Calling groups_inventory to load vars for managed-node3 13830 1727204086.65703: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204086.65709: Calling all_plugins_play to load vars for managed-node3 13830 1727204086.65711: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204086.65714: Calling groups_plugins_play to load vars for managed-node3 13830 1727204086.66989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204086.68775: done with get_vars() 13830 1727204086.68801: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.142) 0:00:19.766 ***** 13830 1727204086.68876: entering _queue_task() for managed-node3/include_tasks 13830 1727204086.69227: worker is 1 (out of 1 available) 13830 1727204086.69241: exiting _queue_task() for managed-node3/include_tasks 13830 1727204086.69254: done queuing things up, now waiting for results queue to drain 13830 1727204086.69255: waiting for pending results... 
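The three includes recorded above (item=bond0, item=bond0.0, item=bond0.1) come from looping assert_profile_present.yml over controller_profile, port1_profile and port2_profile with bond_port_profile as the loop variable; assert_profile_present.yml in turn opens by including get_profile_stat.yml (task path assert_profile_present.yml:3), which is the include being queued at this point. A sketch of how such a looped include is commonly written; the vars/loop_control layout is inferred from the variable names seen in the log, not quoted from the task file:

    # Sketch only: structure inferred from the loop items and variable sources in the log.
    - name: Include the task 'assert_profile_present.yml'
      ansible.builtin.include_tasks: assert_profile_present.yml
      vars:
        profile: "{{ bond_port_profile }}"   # later entries show 'profile' arriving via include params
      loop:
        - "{{ controller_profile }}"   # bond0 in this run
        - "{{ port1_profile }}"        # bond0.0
        - "{{ port2_profile }}"        # bond0.1
      loop_control:
        loop_var: bond_port_profile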
13830 1727204086.69788: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 13830 1727204086.69902: in run() - task 0affcd87-79f5-1659-6b02-0000000004d9 13830 1727204086.69916: variable 'ansible_search_path' from source: unknown 13830 1727204086.69920: variable 'ansible_search_path' from source: unknown 13830 1727204086.69958: calling self._execute() 13830 1727204086.70060: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.70065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.70077: variable 'omit' from source: magic vars 13830 1727204086.70499: variable 'ansible_distribution_major_version' from source: facts 13830 1727204086.70511: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204086.70521: _execute() done 13830 1727204086.70524: dumping result to json 13830 1727204086.70532: done dumping result, returning 13830 1727204086.70535: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1659-6b02-0000000004d9] 13830 1727204086.70543: sending task result for task 0affcd87-79f5-1659-6b02-0000000004d9 13830 1727204086.70642: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004d9 13830 1727204086.70645: WORKER PROCESS EXITING 13830 1727204086.70680: no more pending results, returning what we have 13830 1727204086.70685: in VariableManager get_vars() 13830 1727204086.70729: Calling all_inventory to load vars for managed-node3 13830 1727204086.70732: Calling groups_inventory to load vars for managed-node3 13830 1727204086.70736: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204086.70751: Calling all_plugins_play to load vars for managed-node3 13830 1727204086.70754: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204086.70757: Calling groups_plugins_play to load vars for managed-node3 13830 1727204086.72620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204086.74347: done with get_vars() 13830 1727204086.74377: variable 'ansible_search_path' from source: unknown 13830 1727204086.74379: variable 'ansible_search_path' from source: unknown 13830 1727204086.74422: we have included files to process 13830 1727204086.74423: generating all_blocks data 13830 1727204086.74424: done generating all_blocks data 13830 1727204086.74426: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204086.74427: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204086.74429: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204086.75989: done processing included file 13830 1727204086.75991: iterating over new_blocks loaded from include file 13830 1727204086.75993: in VariableManager get_vars() 13830 1727204086.76013: done with get_vars() 13830 1727204086.76015: filtering new block on tags 13830 1727204086.76176: done filtering new block on tags 13830 1727204086.76181: in VariableManager get_vars() 13830 1727204086.76198: done with get_vars() 13830 1727204086.76199: filtering new block on tags 13830 1727204086.76262: done filtering new block on tags 13830 1727204086.76267: done iterating over 
new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 13830 1727204086.76273: extending task lists for all hosts with included blocks 13830 1727204086.76598: done extending task lists 13830 1727204086.76599: done processing included files 13830 1727204086.76600: results queue empty 13830 1727204086.76601: checking for any_errors_fatal 13830 1727204086.76604: done checking for any_errors_fatal 13830 1727204086.76605: checking for max_fail_percentage 13830 1727204086.76606: done checking for max_fail_percentage 13830 1727204086.76607: checking to see if all hosts have failed and the running result is not ok 13830 1727204086.76607: done checking to see if all hosts have failed 13830 1727204086.76608: getting the remaining hosts for this loop 13830 1727204086.76609: done getting the remaining hosts for this loop 13830 1727204086.76612: getting the next task for host managed-node3 13830 1727204086.76617: done getting next task for host managed-node3 13830 1727204086.76618: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13830 1727204086.76622: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204086.76624: getting variables 13830 1727204086.76625: in VariableManager get_vars() 13830 1727204086.76635: Calling all_inventory to load vars for managed-node3 13830 1727204086.76637: Calling groups_inventory to load vars for managed-node3 13830 1727204086.76639: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204086.76645: Calling all_plugins_play to load vars for managed-node3 13830 1727204086.76647: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204086.76650: Calling groups_plugins_play to load vars for managed-node3 13830 1727204086.78500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204086.80655: done with get_vars() 13830 1727204086.80682: done getting variables 13830 1727204086.80729: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.118) 0:00:19.885 ***** 13830 1727204086.80766: entering _queue_task() for managed-node3/set_fact 13830 1727204086.81103: worker is 1 (out of 1 available) 13830 1727204086.81115: exiting _queue_task() for managed-node3/set_fact 13830 1727204086.81127: done queuing things up, now waiting for results queue to drain 13830 1727204086.81129: waiting for pending results... 
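The set_fact queued here is the first task of get_profile_stat.yml (task path get_profile_stat.yml:3); its result a few entries below shows exactly which flags it initializes. A sketch consistent with that output:

    # Sketch: the three flags and their initial values are taken directly from the task result below.
    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

The "Stat profile file" task that follows (get_profile_stat.yml:9) then stats the profile file for the current bond_port_profile; the path it checks is not visible in the portion of the log shown here, so it is not reproduced in the sketch.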
13830 1727204086.81422: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13830 1727204086.81571: in run() - task 0affcd87-79f5-1659-6b02-0000000004fc 13830 1727204086.81908: variable 'ansible_search_path' from source: unknown 13830 1727204086.81912: variable 'ansible_search_path' from source: unknown 13830 1727204086.81915: calling self._execute() 13830 1727204086.81918: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.81920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.81923: variable 'omit' from source: magic vars 13830 1727204086.82205: variable 'ansible_distribution_major_version' from source: facts 13830 1727204086.82209: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204086.82212: variable 'omit' from source: magic vars 13830 1727204086.82214: variable 'omit' from source: magic vars 13830 1727204086.82224: variable 'omit' from source: magic vars 13830 1727204086.82590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204086.82595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204086.82597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204086.82600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204086.82602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204086.82605: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204086.82608: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.82610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.82612: Set connection var ansible_connection to ssh 13830 1727204086.82615: Set connection var ansible_timeout to 10 13830 1727204086.82617: Set connection var ansible_shell_executable to /bin/sh 13830 1727204086.82619: Set connection var ansible_shell_type to sh 13830 1727204086.82628: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204086.82632: Set connection var ansible_pipelining to False 13830 1727204086.82635: variable 'ansible_shell_executable' from source: unknown 13830 1727204086.82637: variable 'ansible_connection' from source: unknown 13830 1727204086.82640: variable 'ansible_module_compression' from source: unknown 13830 1727204086.82642: variable 'ansible_shell_type' from source: unknown 13830 1727204086.82644: variable 'ansible_shell_executable' from source: unknown 13830 1727204086.82646: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.82648: variable 'ansible_pipelining' from source: unknown 13830 1727204086.82650: variable 'ansible_timeout' from source: unknown 13830 1727204086.82652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.82727: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204086.82742: variable 
'omit' from source: magic vars 13830 1727204086.82745: starting attempt loop 13830 1727204086.82748: running the handler 13830 1727204086.82771: handler run complete 13830 1727204086.82776: attempt loop complete, returning result 13830 1727204086.82809: _execute() done 13830 1727204086.82813: dumping result to json 13830 1727204086.82815: done dumping result, returning 13830 1727204086.82817: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1659-6b02-0000000004fc] 13830 1727204086.82819: sending task result for task 0affcd87-79f5-1659-6b02-0000000004fc 13830 1727204086.82917: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004fc 13830 1727204086.82920: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13830 1727204086.83011: no more pending results, returning what we have 13830 1727204086.83014: results queue empty 13830 1727204086.83015: checking for any_errors_fatal 13830 1727204086.83018: done checking for any_errors_fatal 13830 1727204086.83018: checking for max_fail_percentage 13830 1727204086.83020: done checking for max_fail_percentage 13830 1727204086.83021: checking to see if all hosts have failed and the running result is not ok 13830 1727204086.83022: done checking to see if all hosts have failed 13830 1727204086.83023: getting the remaining hosts for this loop 13830 1727204086.83025: done getting the remaining hosts for this loop 13830 1727204086.83029: getting the next task for host managed-node3 13830 1727204086.83038: done getting next task for host managed-node3 13830 1727204086.83040: ^ task is: TASK: Stat profile file 13830 1727204086.83048: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204086.83053: getting variables 13830 1727204086.83055: in VariableManager get_vars() 13830 1727204086.83091: Calling all_inventory to load vars for managed-node3 13830 1727204086.83094: Calling groups_inventory to load vars for managed-node3 13830 1727204086.83098: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204086.83110: Calling all_plugins_play to load vars for managed-node3 13830 1727204086.83113: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204086.83116: Calling groups_plugins_play to load vars for managed-node3 13830 1727204086.84925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204086.86740: done with get_vars() 13830 1727204086.86773: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.061) 0:00:19.946 ***** 13830 1727204086.86889: entering _queue_task() for managed-node3/stat 13830 1727204086.87384: worker is 1 (out of 1 available) 13830 1727204086.87397: exiting _queue_task() for managed-node3/stat 13830 1727204086.87409: done queuing things up, now waiting for results queue to drain 13830 1727204086.87411: waiting for pending results... 13830 1727204086.87700: running TaskExecutor() for managed-node3/TASK: Stat profile file 13830 1727204086.87839: in run() - task 0affcd87-79f5-1659-6b02-0000000004fd 13830 1727204086.87860: variable 'ansible_search_path' from source: unknown 13830 1727204086.87865: variable 'ansible_search_path' from source: unknown 13830 1727204086.87900: calling self._execute() 13830 1727204086.87995: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.87998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.88010: variable 'omit' from source: magic vars 13830 1727204086.88379: variable 'ansible_distribution_major_version' from source: facts 13830 1727204086.88393: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204086.88404: variable 'omit' from source: magic vars 13830 1727204086.88468: variable 'omit' from source: magic vars 13830 1727204086.88573: variable 'profile' from source: include params 13830 1727204086.88576: variable 'bond_port_profile' from source: include params 13830 1727204086.88647: variable 'bond_port_profile' from source: include params 13830 1727204086.88667: variable 'omit' from source: magic vars 13830 1727204086.88710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204086.88749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204086.88771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204086.88788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204086.88799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204086.88829: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204086.88836: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 
1727204086.88839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.88943: Set connection var ansible_connection to ssh 13830 1727204086.88957: Set connection var ansible_timeout to 10 13830 1727204086.88966: Set connection var ansible_shell_executable to /bin/sh 13830 1727204086.88970: Set connection var ansible_shell_type to sh 13830 1727204086.88974: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204086.88984: Set connection var ansible_pipelining to False 13830 1727204086.89007: variable 'ansible_shell_executable' from source: unknown 13830 1727204086.89011: variable 'ansible_connection' from source: unknown 13830 1727204086.89013: variable 'ansible_module_compression' from source: unknown 13830 1727204086.89015: variable 'ansible_shell_type' from source: unknown 13830 1727204086.89018: variable 'ansible_shell_executable' from source: unknown 13830 1727204086.89020: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204086.89025: variable 'ansible_pipelining' from source: unknown 13830 1727204086.89028: variable 'ansible_timeout' from source: unknown 13830 1727204086.89035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204086.89319: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204086.89329: variable 'omit' from source: magic vars 13830 1727204086.89335: starting attempt loop 13830 1727204086.89338: running the handler 13830 1727204086.89351: _low_level_execute_command(): starting 13830 1727204086.89359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204086.90181: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204086.90194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.90205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.90219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.90261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.90267: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204086.90281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.90295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204086.90303: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204086.90310: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204086.90318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.90328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.90339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.90347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.90354: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204086.90364: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.90442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204086.90458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.90461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.90545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.92176: stdout chunk (state=3): >>>/root <<< 13830 1727204086.92277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204086.92384: stderr chunk (state=3): >>><<< 13830 1727204086.92387: stdout chunk (state=3): >>><<< 13830 1727204086.92414: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204086.92429: _low_level_execute_command(): starting 13830 1727204086.92435: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480 `" && echo ansible-tmp-1727204086.924138-15212-192906650614480="` echo /root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480 `" ) && sleep 0' 13830 1727204086.93066: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204086.93079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.93091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.93108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.93143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.93150: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204086.93160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.93177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204086.93187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204086.93190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204086.93197: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.93207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.93220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.93227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.93235: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204086.93246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.93314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204086.93342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.93345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.93418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.95235: stdout chunk (state=3): >>>ansible-tmp-1727204086.924138-15212-192906650614480=/root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480 <<< 13830 1727204086.95433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204086.95438: stdout chunk (state=3): >>><<< 13830 1727204086.95442: stderr chunk (state=3): >>><<< 13830 1727204086.95571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204086.924138-15212-192906650614480=/root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204086.95575: variable 'ansible_module_compression' from source: unknown 13830 1727204086.95748: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13830 1727204086.95751: variable 'ansible_facts' from source: unknown 13830 1727204086.95753: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480/AnsiballZ_stat.py 13830 1727204086.95980: Sending initial data 13830 1727204086.95983: Sent initial data (152 bytes) 13830 1727204086.97063: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204086.97077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 
1727204086.97088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.97103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.97140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.97148: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204086.97160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.97183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204086.97195: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204086.97202: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204086.97210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204086.97220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204086.97234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204086.97238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204086.97245: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204086.97255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204086.97339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204086.97359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204086.97373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204086.97450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204086.99133: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204086.99167: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204086.99203: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp2yrjoo7o /root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480/AnsiballZ_stat.py <<< 13830 1727204086.99237: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204087.00252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204087.00278: stderr chunk (state=3): >>><<< 13830 1727204087.00281: stdout chunk (state=3): >>><<< 13830 1727204087.00395: done transferring module to remote 13830 1727204087.00399: _low_level_execute_command(): starting 13830 1727204087.00404: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480/ /root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480/AnsiballZ_stat.py && sleep 0' 13830 1727204087.01009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204087.01018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.01038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204087.01052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.01095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204087.01104: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204087.01114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.01128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204087.01147: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204087.01154: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204087.01162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.01177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204087.01189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.01197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204087.01203: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204087.01213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.01294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204087.01313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204087.01324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204087.01395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.03077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204087.03150: stderr chunk (state=3): >>><<< 13830 1727204087.03154: stdout chunk (state=3): >>><<< 13830 1727204087.03176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204087.03180: _low_level_execute_command(): starting 13830 1727204087.03184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480/AnsiballZ_stat.py && sleep 0' 13830 1727204087.03872: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.03916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.03921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.04249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.17209: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13830 1727204087.18190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204087.18254: stderr chunk (state=3): >>><<< 13830 1727204087.18257: stdout chunk (state=3): >>><<< 13830 1727204087.18308: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204087.18313: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204087.18320: _low_level_execute_command(): starting 13830 1727204087.18322: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204086.924138-15212-192906650614480/ > /dev/null 2>&1 && sleep 0' 13830 1727204087.18784: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.18788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204087.18798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.18833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204087.18837: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204087.18847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.18861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.18872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.18878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204087.18885: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204087.18890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.18940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204087.18965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204087.18968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204087.19022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.20811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204087.20895: stderr chunk (state=3): >>><<< 13830 1727204087.20899: stdout chunk (state=3): >>><<< 13830 1727204087.21220: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204087.21224: handler run complete 13830 1727204087.21226: attempt loop complete, returning result 13830 1727204087.21229: _execute() done 13830 1727204087.21234: dumping result to json 13830 1727204087.21237: done dumping result, returning 13830 1727204087.21240: done running TaskExecutor() for managed-node3/TASK: Stat profile file [0affcd87-79f5-1659-6b02-0000000004fd] 13830 1727204087.21242: sending task result for task 0affcd87-79f5-1659-6b02-0000000004fd 13830 1727204087.21319: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004fd 13830 1727204087.21323: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 13830 1727204087.21389: no more pending results, returning what we have 13830 1727204087.21393: results queue empty 13830 1727204087.21394: checking for any_errors_fatal 13830 1727204087.21399: done checking for any_errors_fatal 13830 1727204087.21400: checking for max_fail_percentage 13830 1727204087.21402: done checking for max_fail_percentage 13830 1727204087.21403: 
checking to see if all hosts have failed and the running result is not ok 13830 1727204087.21404: done checking to see if all hosts have failed 13830 1727204087.21405: getting the remaining hosts for this loop 13830 1727204087.21406: done getting the remaining hosts for this loop 13830 1727204087.21410: getting the next task for host managed-node3 13830 1727204087.21416: done getting next task for host managed-node3 13830 1727204087.21419: ^ task is: TASK: Set NM profile exist flag based on the profile files 13830 1727204087.21424: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204087.21428: getting variables 13830 1727204087.21429: in VariableManager get_vars() 13830 1727204087.21458: Calling all_inventory to load vars for managed-node3 13830 1727204087.21461: Calling groups_inventory to load vars for managed-node3 13830 1727204087.21466: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204087.21477: Calling all_plugins_play to load vars for managed-node3 13830 1727204087.21482: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204087.21486: Calling groups_plugins_play to load vars for managed-node3 13830 1727204087.22596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204087.23654: done with get_vars() 13830 1727204087.23682: done getting variables 13830 1727204087.23742: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.369) 0:00:20.316 ***** 13830 1727204087.23793: entering _queue_task() for managed-node3/set_fact 13830 1727204087.24166: worker is 1 (out of 1 available) 13830 1727204087.24184: exiting _queue_task() for managed-node3/set_fact 13830 1727204087.24202: done queuing things up, now waiting for results queue to drain 13830 1727204087.24203: waiting for pending results... 
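The stat result above ("exists": false for /etc/sysconfig/network-scripts/ifcfg-bond0) is registered as profile_stat and drives the conditional evaluated in the next entry. A minimal sketch of the two tasks from tests/network/playbooks/tasks/get_profile_stat.yml that this corresponds to, reconstructed only from the module arguments, task names, and the profile_stat.stat.exists condition in this log; the {{ profile }} templating and the fact name in the second task are assumptions, since the playbook source itself is not reproduced here.

# Hedged reconstruction, not the actual contents of get_profile_stat.yml
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # renders to ifcfg-bond0 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true  # illustrative fact; the task is skipped in this run,
                                  # so the real fact list never appears in the log
  when: profile_stat.stat.exists
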
13830 1727204087.24517: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 13830 1727204087.24730: in run() - task 0affcd87-79f5-1659-6b02-0000000004fe 13830 1727204087.24734: variable 'ansible_search_path' from source: unknown 13830 1727204087.24737: variable 'ansible_search_path' from source: unknown 13830 1727204087.24740: calling self._execute() 13830 1727204087.24822: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.24838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.24854: variable 'omit' from source: magic vars 13830 1727204087.25262: variable 'ansible_distribution_major_version' from source: facts 13830 1727204087.25286: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204087.25427: variable 'profile_stat' from source: set_fact 13830 1727204087.25444: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204087.25452: when evaluation is False, skipping this task 13830 1727204087.25459: _execute() done 13830 1727204087.25468: dumping result to json 13830 1727204087.25476: done dumping result, returning 13830 1727204087.25490: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1659-6b02-0000000004fe] 13830 1727204087.25503: sending task result for task 0affcd87-79f5-1659-6b02-0000000004fe skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204087.25668: no more pending results, returning what we have 13830 1727204087.25673: results queue empty 13830 1727204087.25674: checking for any_errors_fatal 13830 1727204087.25683: done checking for any_errors_fatal 13830 1727204087.25684: checking for max_fail_percentage 13830 1727204087.25686: done checking for max_fail_percentage 13830 1727204087.25687: checking to see if all hosts have failed and the running result is not ok 13830 1727204087.25688: done checking to see if all hosts have failed 13830 1727204087.25689: getting the remaining hosts for this loop 13830 1727204087.25691: done getting the remaining hosts for this loop 13830 1727204087.25696: getting the next task for host managed-node3 13830 1727204087.25705: done getting next task for host managed-node3 13830 1727204087.25709: ^ task is: TASK: Get NM profile info 13830 1727204087.25715: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204087.25720: getting variables 13830 1727204087.25722: in VariableManager get_vars() 13830 1727204087.25758: Calling all_inventory to load vars for managed-node3 13830 1727204087.25761: Calling groups_inventory to load vars for managed-node3 13830 1727204087.25772: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204087.25786: Calling all_plugins_play to load vars for managed-node3 13830 1727204087.25789: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204087.25792: Calling groups_plugins_play to load vars for managed-node3 13830 1727204087.26406: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004fe 13830 1727204087.26410: WORKER PROCESS EXITING 13830 1727204087.26800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204087.27797: done with get_vars() 13830 1727204087.27824: done getting variables 13830 1727204087.27889: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.041) 0:00:20.357 ***** 13830 1727204087.27926: entering _queue_task() for managed-node3/shell 13830 1727204087.28266: worker is 1 (out of 1 available) 13830 1727204087.28279: exiting _queue_task() for managed-node3/shell 13830 1727204087.28291: done queuing things up, now waiting for results queue to drain 13830 1727204087.28292: waiting for pending results... 
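The "Get NM profile info" task queued above runs a shell command; the rendered command line and its output appear in the entries that follow. A minimal sketch of the task at get_profile_stat.yml:25, assuming the literal bond0 in the rendered command comes from the profile/bond_port_profile variables referenced in this log, and that the result is registered as nm_profile_exists (implied by the nm_profile_exists.rc == 0 condition evaluated further below):

# Hedged sketch, not the actual task definition
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  # In this run the command renders as:
  #   nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc
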
13830 1727204087.28580: running TaskExecutor() for managed-node3/TASK: Get NM profile info 13830 1727204087.28695: in run() - task 0affcd87-79f5-1659-6b02-0000000004ff 13830 1727204087.28711: variable 'ansible_search_path' from source: unknown 13830 1727204087.28715: variable 'ansible_search_path' from source: unknown 13830 1727204087.28766: calling self._execute() 13830 1727204087.28855: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.28859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.28871: variable 'omit' from source: magic vars 13830 1727204087.29237: variable 'ansible_distribution_major_version' from source: facts 13830 1727204087.29249: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204087.29254: variable 'omit' from source: magic vars 13830 1727204087.29313: variable 'omit' from source: magic vars 13830 1727204087.29384: variable 'profile' from source: include params 13830 1727204087.29389: variable 'bond_port_profile' from source: include params 13830 1727204087.29439: variable 'bond_port_profile' from source: include params 13830 1727204087.29453: variable 'omit' from source: magic vars 13830 1727204087.29489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204087.29518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204087.29536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204087.29549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204087.29559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204087.29584: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204087.29587: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.29589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.29661: Set connection var ansible_connection to ssh 13830 1727204087.29671: Set connection var ansible_timeout to 10 13830 1727204087.29676: Set connection var ansible_shell_executable to /bin/sh 13830 1727204087.29678: Set connection var ansible_shell_type to sh 13830 1727204087.29684: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204087.29691: Set connection var ansible_pipelining to False 13830 1727204087.29710: variable 'ansible_shell_executable' from source: unknown 13830 1727204087.29712: variable 'ansible_connection' from source: unknown 13830 1727204087.29719: variable 'ansible_module_compression' from source: unknown 13830 1727204087.29721: variable 'ansible_shell_type' from source: unknown 13830 1727204087.29724: variable 'ansible_shell_executable' from source: unknown 13830 1727204087.29726: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.29731: variable 'ansible_pipelining' from source: unknown 13830 1727204087.29734: variable 'ansible_timeout' from source: unknown 13830 1727204087.29736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.29835: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204087.29842: variable 'omit' from source: magic vars 13830 1727204087.29852: starting attempt loop 13830 1727204087.29855: running the handler 13830 1727204087.29860: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204087.29878: _low_level_execute_command(): starting 13830 1727204087.29884: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204087.30415: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204087.30425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.30451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.30468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.30518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204087.30535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204087.30549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204087.30590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.32153: stdout chunk (state=3): >>>/root <<< 13830 1727204087.32257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204087.32313: stderr chunk (state=3): >>><<< 13830 1727204087.32316: stdout chunk (state=3): >>><<< 13830 1727204087.32339: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204087.32352: _low_level_execute_command(): starting 13830 1727204087.32358: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511 `" && echo ansible-tmp-1727204087.32339-15244-268402756843511="` echo /root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511 `" ) && sleep 0' 13830 1727204087.32821: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.32835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.32865: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.32879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.32891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.32936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204087.32954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204087.32998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.34799: stdout chunk (state=3): >>>ansible-tmp-1727204087.32339-15244-268402756843511=/root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511 <<< 13830 1727204087.34906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204087.34966: stderr chunk (state=3): >>><<< 13830 1727204087.34970: stdout chunk (state=3): >>><<< 13830 1727204087.34986: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204087.32339-15244-268402756843511=/root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204087.35017: variable 'ansible_module_compression' from source: unknown 13830 1727204087.35060: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204087.35093: variable 'ansible_facts' from source: unknown 13830 1727204087.35145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511/AnsiballZ_command.py 13830 1727204087.35260: Sending initial data 13830 1727204087.35265: Sent initial data (154 bytes) 13830 1727204087.35969: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204087.35982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.36011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204087.36015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204087.36027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.36086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204087.36089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204087.36094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204087.36135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.37801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204087.37837: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204087.37876: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpnzmrb4kq 
/root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511/AnsiballZ_command.py <<< 13830 1727204087.37913: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204087.38722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204087.38844: stderr chunk (state=3): >>><<< 13830 1727204087.38848: stdout chunk (state=3): >>><<< 13830 1727204087.38866: done transferring module to remote 13830 1727204087.38876: _low_level_execute_command(): starting 13830 1727204087.38881: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511/ /root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511/AnsiballZ_command.py && sleep 0' 13830 1727204087.39359: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.39367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.39396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.39408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.39461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204087.39480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204087.39529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.41207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204087.41261: stderr chunk (state=3): >>><<< 13830 1727204087.41267: stdout chunk (state=3): >>><<< 13830 1727204087.41284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204087.41288: _low_level_execute_command(): starting 13830 1727204087.41291: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511/AnsiballZ_command.py && sleep 0' 13830 1727204087.41749: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.41754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.41789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.41802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.41858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204087.41873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204087.41928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.57442: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:47.549554", "end": "2024-09-24 14:54:47.573294", "delta": "0:00:00.023740", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204087.58793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204087.58848: stderr chunk (state=3): >>><<< 13830 1727204087.58852: stdout chunk (state=3): >>><<< 13830 1727204087.59011: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:47.549554", "end": "2024-09-24 14:54:47.573294", "delta": "0:00:00.023740", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
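Most of the stderr captured above is OpenSSH client debug output, and every command in this section logs "auto-mux: Trying existing master" followed by "mux_client_request_session": each /bin/sh invocation is tunnelled through one persistent ControlMaster connection instead of opening a fresh SSH session (Ansible's default ssh_args enable ControlMaster/ControlPersist). As a rough illustration only, equivalent multiplexing options could be expressed per host or group as below; the control path and persist timeout are assumptions, not values read from this run's configuration.

# Illustrative inventory/group variable, not this run's actual settings
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=~/.ansible/cp/%C
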
13830 1727204087.59015: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204087.59018: _low_level_execute_command(): starting 13830 1727204087.59021: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204087.32339-15244-268402756843511/ > /dev/null 2>&1 && sleep 0' 13830 1727204087.59906: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204087.59909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204087.59945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.59948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204087.59950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204087.60024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204087.60034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204087.60037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204087.60090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204087.62048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204087.62052: stdout chunk (state=3): >>><<< 13830 1727204087.62054: stderr chunk (state=3): >>><<< 13830 1727204087.62373: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204087.62377: handler run complete 13830 1727204087.62379: Evaluated conditional (False): False 13830 1727204087.62381: attempt loop complete, returning result 13830 1727204087.62383: _execute() done 13830 1727204087.62385: dumping result to json 13830 1727204087.62387: done dumping result, returning 13830 1727204087.62389: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [0affcd87-79f5-1659-6b02-0000000004ff] 13830 1727204087.62391: sending task result for task 0affcd87-79f5-1659-6b02-0000000004ff 13830 1727204087.62473: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004ff 13830 1727204087.62477: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.023740", "end": "2024-09-24 14:54:47.573294", "rc": 0, "start": "2024-09-24 14:54:47.549554" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 13830 1727204087.62547: no more pending results, returning what we have 13830 1727204087.62550: results queue empty 13830 1727204087.62551: checking for any_errors_fatal 13830 1727204087.62556: done checking for any_errors_fatal 13830 1727204087.62557: checking for max_fail_percentage 13830 1727204087.62558: done checking for max_fail_percentage 13830 1727204087.62559: checking to see if all hosts have failed and the running result is not ok 13830 1727204087.62560: done checking to see if all hosts have failed 13830 1727204087.62560: getting the remaining hosts for this loop 13830 1727204087.62562: done getting the remaining hosts for this loop 13830 1727204087.62567: getting the next task for host managed-node3 13830 1727204087.62574: done getting next task for host managed-node3 13830 1727204087.62576: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13830 1727204087.62583: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204087.62586: getting variables 13830 1727204087.62587: in VariableManager get_vars() 13830 1727204087.62616: Calling all_inventory to load vars for managed-node3 13830 1727204087.62619: Calling groups_inventory to load vars for managed-node3 13830 1727204087.62623: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204087.62634: Calling all_plugins_play to load vars for managed-node3 13830 1727204087.62636: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204087.62640: Calling groups_plugins_play to load vars for managed-node3 13830 1727204087.64315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204087.66077: done with get_vars() 13830 1727204087.66108: done getting variables 13830 1727204087.66179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.382) 0:00:20.740 ***** 13830 1727204087.66216: entering _queue_task() for managed-node3/set_fact 13830 1727204087.66605: worker is 1 (out of 1 available) 13830 1727204087.66621: exiting _queue_task() for managed-node3/set_fact 13830 1727204087.66634: done queuing things up, now waiting for results queue to drain 13830 1727204087.66636: waiting for pending results... 
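The task queued above converts the nmcli result into facts; its condition (nm_profile_exists.rc == 0) and the three facts it sets are both visible in the evaluation and result that follow. A minimal sketch of what the task at get_profile_stat.yml:35 plausibly contains, based only on that condition and those fact names:

# Hedged sketch, not the actual task definition
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0
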
13830 1727204087.66981: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13830 1727204087.67201: in run() - task 0affcd87-79f5-1659-6b02-000000000500 13830 1727204087.67229: variable 'ansible_search_path' from source: unknown 13830 1727204087.67240: variable 'ansible_search_path' from source: unknown 13830 1727204087.67299: calling self._execute() 13830 1727204087.67412: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.67431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.67448: variable 'omit' from source: magic vars 13830 1727204087.67836: variable 'ansible_distribution_major_version' from source: facts 13830 1727204087.67857: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204087.67991: variable 'nm_profile_exists' from source: set_fact 13830 1727204087.68011: Evaluated conditional (nm_profile_exists.rc == 0): True 13830 1727204087.68024: variable 'omit' from source: magic vars 13830 1727204087.68089: variable 'omit' from source: magic vars 13830 1727204087.68130: variable 'omit' from source: magic vars 13830 1727204087.68179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204087.68224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204087.68257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204087.68284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204087.68306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204087.68346: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204087.68358: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.68370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.68484: Set connection var ansible_connection to ssh 13830 1727204087.68502: Set connection var ansible_timeout to 10 13830 1727204087.68518: Set connection var ansible_shell_executable to /bin/sh 13830 1727204087.68524: Set connection var ansible_shell_type to sh 13830 1727204087.68533: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204087.68545: Set connection var ansible_pipelining to False 13830 1727204087.68575: variable 'ansible_shell_executable' from source: unknown 13830 1727204087.68582: variable 'ansible_connection' from source: unknown 13830 1727204087.68587: variable 'ansible_module_compression' from source: unknown 13830 1727204087.68593: variable 'ansible_shell_type' from source: unknown 13830 1727204087.68598: variable 'ansible_shell_executable' from source: unknown 13830 1727204087.68603: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.68609: variable 'ansible_pipelining' from source: unknown 13830 1727204087.68615: variable 'ansible_timeout' from source: unknown 13830 1727204087.68626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.68767: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204087.68788: variable 'omit' from source: magic vars 13830 1727204087.68797: starting attempt loop 13830 1727204087.68803: running the handler 13830 1727204087.68821: handler run complete 13830 1727204087.68834: attempt loop complete, returning result 13830 1727204087.68844: _execute() done 13830 1727204087.68849: dumping result to json 13830 1727204087.68855: done dumping result, returning 13830 1727204087.68867: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1659-6b02-000000000500] 13830 1727204087.68876: sending task result for task 0affcd87-79f5-1659-6b02-000000000500 ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13830 1727204087.69028: no more pending results, returning what we have 13830 1727204087.69032: results queue empty 13830 1727204087.69033: checking for any_errors_fatal 13830 1727204087.69041: done checking for any_errors_fatal 13830 1727204087.69042: checking for max_fail_percentage 13830 1727204087.69043: done checking for max_fail_percentage 13830 1727204087.69044: checking to see if all hosts have failed and the running result is not ok 13830 1727204087.69045: done checking to see if all hosts have failed 13830 1727204087.69046: getting the remaining hosts for this loop 13830 1727204087.69047: done getting the remaining hosts for this loop 13830 1727204087.69052: getting the next task for host managed-node3 13830 1727204087.69076: done getting next task for host managed-node3 13830 1727204087.69080: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13830 1727204087.69087: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204087.69091: getting variables 13830 1727204087.69095: in VariableManager get_vars() 13830 1727204087.69130: Calling all_inventory to load vars for managed-node3 13830 1727204087.69133: Calling groups_inventory to load vars for managed-node3 13830 1727204087.69137: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204087.69148: Calling all_plugins_play to load vars for managed-node3 13830 1727204087.69151: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204087.69153: Calling groups_plugins_play to load vars for managed-node3 13830 1727204087.71091: done sending task result for task 0affcd87-79f5-1659-6b02-000000000500 13830 1727204087.71097: WORKER PROCESS EXITING 13830 1727204087.72816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204087.76338: done with get_vars() 13830 1727204087.76370: done getting variables 13830 1727204087.76428: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204087.76952: variable 'profile' from source: include params 13830 1727204087.76957: variable 'bond_port_profile' from source: include params 13830 1727204087.77023: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.108) 0:00:20.848 ***** 13830 1727204087.77058: entering _queue_task() for managed-node3/command 13830 1727204087.77372: worker is 1 (out of 1 available) 13830 1727204087.77384: exiting _queue_task() for managed-node3/command 13830 1727204087.77397: done queuing things up, now waiting for results queue to drain 13830 1727204087.77399: waiting for pending results... 
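Annotation: the set_fact task at get_profile_stat.yml:35 can be reconstructed almost entirely from the result above, which lists the three facts it sets and the conditional it was gated on. A sketch (the real file may order or template these differently):

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0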
13830 1727204087.78627: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0 13830 1727204087.78772: in run() - task 0affcd87-79f5-1659-6b02-000000000502 13830 1727204087.78793: variable 'ansible_search_path' from source: unknown 13830 1727204087.78801: variable 'ansible_search_path' from source: unknown 13830 1727204087.78845: calling self._execute() 13830 1727204087.78938: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.78948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.78962: variable 'omit' from source: magic vars 13830 1727204087.79312: variable 'ansible_distribution_major_version' from source: facts 13830 1727204087.79742: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204087.79871: variable 'profile_stat' from source: set_fact 13830 1727204087.79886: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204087.79894: when evaluation is False, skipping this task 13830 1727204087.79900: _execute() done 13830 1727204087.79906: dumping result to json 13830 1727204087.79913: done dumping result, returning 13830 1727204087.79922: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affcd87-79f5-1659-6b02-000000000502] 13830 1727204087.79934: sending task result for task 0affcd87-79f5-1659-6b02-000000000502 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204087.80087: no more pending results, returning what we have 13830 1727204087.80091: results queue empty 13830 1727204087.80092: checking for any_errors_fatal 13830 1727204087.80099: done checking for any_errors_fatal 13830 1727204087.80100: checking for max_fail_percentage 13830 1727204087.80101: done checking for max_fail_percentage 13830 1727204087.80102: checking to see if all hosts have failed and the running result is not ok 13830 1727204087.80103: done checking to see if all hosts have failed 13830 1727204087.80104: getting the remaining hosts for this loop 13830 1727204087.80106: done getting the remaining hosts for this loop 13830 1727204087.80110: getting the next task for host managed-node3 13830 1727204087.80117: done getting next task for host managed-node3 13830 1727204087.80119: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13830 1727204087.80125: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204087.80129: getting variables 13830 1727204087.80131: in VariableManager get_vars() 13830 1727204087.80165: Calling all_inventory to load vars for managed-node3 13830 1727204087.80168: Calling groups_inventory to load vars for managed-node3 13830 1727204087.80171: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204087.80185: Calling all_plugins_play to load vars for managed-node3 13830 1727204087.80187: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204087.80190: Calling groups_plugins_play to load vars for managed-node3 13830 1727204087.80712: done sending task result for task 0affcd87-79f5-1659-6b02-000000000502 13830 1727204087.80715: WORKER PROCESS EXITING 13830 1727204087.82728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204087.86194: done with get_vars() 13830 1727204087.86228: done getting variables 13830 1727204087.86292: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204087.86413: variable 'profile' from source: include params 13830 1727204087.86417: variable 'bond_port_profile' from source: include params 13830 1727204087.86480: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.094) 0:00:20.943 ***** 13830 1727204087.86514: entering _queue_task() for managed-node3/set_fact 13830 1727204087.87252: worker is 1 (out of 1 available) 13830 1727204087.87334: exiting _queue_task() for managed-node3/set_fact 13830 1727204087.87347: done queuing things up, now waiting for results queue to drain 13830 1727204087.87348: waiting for pending results... 
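Annotation: the skip above ("Get the ansible_managed comment in ifcfg-bond0") never runs its command, so its body does not appear in the log; only the guard `profile_stat.stat.exists` is visible. A hypothetical shape of the guarded task at get_profile_stat.yml:49 (the grep target, file path, and register name are guesses):

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # path is a guess
  register: lsr_net_profile_ansible_managed_grep  # hypothetical register name
  when: profile_stat.stat.exists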
13830 1727204087.87955: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 13830 1727204087.88221: in run() - task 0affcd87-79f5-1659-6b02-000000000503 13830 1727204087.88248: variable 'ansible_search_path' from source: unknown 13830 1727204087.88257: variable 'ansible_search_path' from source: unknown 13830 1727204087.88308: calling self._execute() 13830 1727204087.88525: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204087.88540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204087.88554: variable 'omit' from source: magic vars 13830 1727204087.89340: variable 'ansible_distribution_major_version' from source: facts 13830 1727204087.89516: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204087.89772: variable 'profile_stat' from source: set_fact 13830 1727204087.89790: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204087.89799: when evaluation is False, skipping this task 13830 1727204087.89806: _execute() done 13830 1727204087.89814: dumping result to json 13830 1727204087.89827: done dumping result, returning 13830 1727204087.89841: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affcd87-79f5-1659-6b02-000000000503] 13830 1727204087.89946: sending task result for task 0affcd87-79f5-1659-6b02-000000000503 13830 1727204087.90065: done sending task result for task 0affcd87-79f5-1659-6b02-000000000503 13830 1727204087.90074: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204087.90220: no more pending results, returning what we have 13830 1727204087.90226: results queue empty 13830 1727204087.90227: checking for any_errors_fatal 13830 1727204087.90236: done checking for any_errors_fatal 13830 1727204087.90237: checking for max_fail_percentage 13830 1727204087.90239: done checking for max_fail_percentage 13830 1727204087.90240: checking to see if all hosts have failed and the running result is not ok 13830 1727204087.90242: done checking to see if all hosts have failed 13830 1727204087.90242: getting the remaining hosts for this loop 13830 1727204087.90245: done getting the remaining hosts for this loop 13830 1727204087.90249: getting the next task for host managed-node3 13830 1727204087.90259: done getting next task for host managed-node3 13830 1727204087.90261: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13830 1727204087.90269: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204087.90273: getting variables 13830 1727204087.90275: in VariableManager get_vars() 13830 1727204087.90308: Calling all_inventory to load vars for managed-node3 13830 1727204087.90311: Calling groups_inventory to load vars for managed-node3 13830 1727204087.90314: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204087.90327: Calling all_plugins_play to load vars for managed-node3 13830 1727204087.90329: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204087.90331: Calling groups_plugins_play to load vars for managed-node3 13830 1727204087.93294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204087.97009: done with get_vars() 13830 1727204087.97047: done getting variables 13830 1727204087.97113: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204087.97231: variable 'profile' from source: include params 13830 1727204087.97235: variable 'bond_port_profile' from source: include params 13830 1727204087.97500: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.110) 0:00:21.053 ***** 13830 1727204087.97536: entering _queue_task() for managed-node3/command 13830 1727204087.98015: worker is 1 (out of 1 available) 13830 1727204087.98029: exiting _queue_task() for managed-node3/command 13830 1727204087.98041: done queuing things up, now waiting for results queue to drain 13830 1727204087.98043: waiting for pending results... 
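Annotation: profile_stat appears to be the registered result of an earlier stat task (registered results show up as source "set_fact" in this variable dump). Because the nmcli listing earlier in this run shows bond0 stored as a NetworkManager keyfile under /etc/NetworkManager/system-connections, there is no ifcfg-bond0 file, so profile_stat.stat.exists is False and every ifcfg-specific check in get_profile_stat.yml is skipped. A hypothetical sketch of the registering task (task name and path are assumptions):

- name: Get the stat of the ifcfg file  # hypothetical task name
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # assumed location
  register: profile_stat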
13830 1727204087.99871: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0 13830 1727204088.00945: in run() - task 0affcd87-79f5-1659-6b02-000000000504 13830 1727204088.00973: variable 'ansible_search_path' from source: unknown 13830 1727204088.00981: variable 'ansible_search_path' from source: unknown 13830 1727204088.01022: calling self._execute() 13830 1727204088.01120: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.01134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.01149: variable 'omit' from source: magic vars 13830 1727204088.01621: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.02386: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.02528: variable 'profile_stat' from source: set_fact 13830 1727204088.02551: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204088.02561: when evaluation is False, skipping this task 13830 1727204088.02571: _execute() done 13830 1727204088.02578: dumping result to json 13830 1727204088.02585: done dumping result, returning 13830 1727204088.02595: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0 [0affcd87-79f5-1659-6b02-000000000504] 13830 1727204088.02605: sending task result for task 0affcd87-79f5-1659-6b02-000000000504 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204088.02759: no more pending results, returning what we have 13830 1727204088.02767: results queue empty 13830 1727204088.02768: checking for any_errors_fatal 13830 1727204088.02775: done checking for any_errors_fatal 13830 1727204088.02776: checking for max_fail_percentage 13830 1727204088.02778: done checking for max_fail_percentage 13830 1727204088.02778: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.02779: done checking to see if all hosts have failed 13830 1727204088.02780: getting the remaining hosts for this loop 13830 1727204088.02782: done getting the remaining hosts for this loop 13830 1727204088.02786: getting the next task for host managed-node3 13830 1727204088.02795: done getting next task for host managed-node3 13830 1727204088.02797: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13830 1727204088.02803: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204088.02807: getting variables 13830 1727204088.02809: in VariableManager get_vars() 13830 1727204088.02844: Calling all_inventory to load vars for managed-node3 13830 1727204088.02846: Calling groups_inventory to load vars for managed-node3 13830 1727204088.02850: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.02868: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.02871: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.02874: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.03673: done sending task result for task 0affcd87-79f5-1659-6b02-000000000504 13830 1727204088.03677: WORKER PROCESS EXITING 13830 1727204088.06145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.09176: done with get_vars() 13830 1727204088.09209: done getting variables 13830 1727204088.09285: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204088.09413: variable 'profile' from source: include params 13830 1727204088.09417: variable 'bond_port_profile' from source: include params 13830 1727204088.09486: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.119) 0:00:21.173 ***** 13830 1727204088.09521: entering _queue_task() for managed-node3/set_fact 13830 1727204088.09868: worker is 1 (out of 1 available) 13830 1727204088.09883: exiting _queue_task() for managed-node3/set_fact 13830 1727204088.09897: done queuing things up, now waiting for results queue to drain 13830 1727204088.09899: waiting for pending results... 
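Annotation: the `profile` variable used in the task names above is reported "from source: include params", alongside `bond_port_profile`. That suggests the calling include passes the port profile name through a vars mapping, roughly like the sketch below (the include target name is taken from the log; the vars mapping is inferred):

- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
  vars:
    profile: "{{ bond_port_profile }}"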
13830 1727204088.10600: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0 13830 1727204088.10777: in run() - task 0affcd87-79f5-1659-6b02-000000000505 13830 1727204088.10804: variable 'ansible_search_path' from source: unknown 13830 1727204088.10816: variable 'ansible_search_path' from source: unknown 13830 1727204088.10865: calling self._execute() 13830 1727204088.10979: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.10994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.11012: variable 'omit' from source: magic vars 13830 1727204088.11400: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.11424: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.11567: variable 'profile_stat' from source: set_fact 13830 1727204088.11585: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204088.11592: when evaluation is False, skipping this task 13830 1727204088.11599: _execute() done 13830 1727204088.11605: dumping result to json 13830 1727204088.11613: done dumping result, returning 13830 1727204088.11624: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affcd87-79f5-1659-6b02-000000000505] 13830 1727204088.11643: sending task result for task 0affcd87-79f5-1659-6b02-000000000505 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204088.11812: no more pending results, returning what we have 13830 1727204088.11816: results queue empty 13830 1727204088.11817: checking for any_errors_fatal 13830 1727204088.11825: done checking for any_errors_fatal 13830 1727204088.11825: checking for max_fail_percentage 13830 1727204088.11827: done checking for max_fail_percentage 13830 1727204088.11828: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.11832: done checking to see if all hosts have failed 13830 1727204088.11833: getting the remaining hosts for this loop 13830 1727204088.11835: done getting the remaining hosts for this loop 13830 1727204088.11840: getting the next task for host managed-node3 13830 1727204088.11852: done getting next task for host managed-node3 13830 1727204088.11855: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13830 1727204088.11864: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204088.11869: getting variables 13830 1727204088.11873: in VariableManager get_vars() 13830 1727204088.11912: Calling all_inventory to load vars for managed-node3 13830 1727204088.11915: Calling groups_inventory to load vars for managed-node3 13830 1727204088.11920: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.11937: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.11940: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.11943: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.12895: done sending task result for task 0affcd87-79f5-1659-6b02-000000000505 13830 1727204088.12898: WORKER PROCESS EXITING 13830 1727204088.14094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.16345: done with get_vars() 13830 1727204088.16380: done getting variables 13830 1727204088.16456: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204088.16588: variable 'profile' from source: include params 13830 1727204088.16592: variable 'bond_port_profile' from source: include params 13830 1727204088.16773: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.072) 0:00:21.246 ***** 13830 1727204088.16807: entering _queue_task() for managed-node3/assert 13830 1727204088.17726: worker is 1 (out of 1 available) 13830 1727204088.17743: exiting _queue_task() for managed-node3/assert 13830 1727204088.17755: done queuing things up, now waiting for results queue to drain 13830 1727204088.17757: waiting for pending results... 
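Annotation: the three assert tasks in assert_profile_present.yml (lines 5, 10 and 15) check the flags set earlier. Judging from the task names and the conditionals evaluated in the records that follow, they likely look roughly like this sketch (the that: lists are inferred; any fail messages in the real file are not shown in the log):

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint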
13830 1727204088.18000: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0' 13830 1727204088.18120: in run() - task 0affcd87-79f5-1659-6b02-0000000004da 13830 1727204088.18135: variable 'ansible_search_path' from source: unknown 13830 1727204088.18138: variable 'ansible_search_path' from source: unknown 13830 1727204088.18175: calling self._execute() 13830 1727204088.18271: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.18274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.18285: variable 'omit' from source: magic vars 13830 1727204088.18723: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.18745: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.18750: variable 'omit' from source: magic vars 13830 1727204088.18826: variable 'omit' from source: magic vars 13830 1727204088.18959: variable 'profile' from source: include params 13830 1727204088.18962: variable 'bond_port_profile' from source: include params 13830 1727204088.19066: variable 'bond_port_profile' from source: include params 13830 1727204088.19092: variable 'omit' from source: magic vars 13830 1727204088.19137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204088.19170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204088.19195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204088.19215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.19226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.19257: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204088.19260: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.19263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.19410: Set connection var ansible_connection to ssh 13830 1727204088.19422: Set connection var ansible_timeout to 10 13830 1727204088.19428: Set connection var ansible_shell_executable to /bin/sh 13830 1727204088.19435: Set connection var ansible_shell_type to sh 13830 1727204088.19437: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204088.19444: Set connection var ansible_pipelining to False 13830 1727204088.19512: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.19524: variable 'ansible_connection' from source: unknown 13830 1727204088.19529: variable 'ansible_module_compression' from source: unknown 13830 1727204088.19534: variable 'ansible_shell_type' from source: unknown 13830 1727204088.19536: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.19539: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.19572: variable 'ansible_pipelining' from source: unknown 13830 1727204088.19575: variable 'ansible_timeout' from source: unknown 13830 1727204088.19578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.20470: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204088.20474: variable 'omit' from source: magic vars 13830 1727204088.20477: starting attempt loop 13830 1727204088.20479: running the handler 13830 1727204088.20481: variable 'lsr_net_profile_exists' from source: set_fact 13830 1727204088.20484: Evaluated conditional (lsr_net_profile_exists): True 13830 1727204088.20486: handler run complete 13830 1727204088.20494: attempt loop complete, returning result 13830 1727204088.20496: _execute() done 13830 1727204088.20498: dumping result to json 13830 1727204088.20500: done dumping result, returning 13830 1727204088.20502: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0' [0affcd87-79f5-1659-6b02-0000000004da] 13830 1727204088.20504: sending task result for task 0affcd87-79f5-1659-6b02-0000000004da 13830 1727204088.20579: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004da 13830 1727204088.20582: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204088.20661: no more pending results, returning what we have 13830 1727204088.20666: results queue empty 13830 1727204088.20667: checking for any_errors_fatal 13830 1727204088.20702: done checking for any_errors_fatal 13830 1727204088.20704: checking for max_fail_percentage 13830 1727204088.20706: done checking for max_fail_percentage 13830 1727204088.20707: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.20707: done checking to see if all hosts have failed 13830 1727204088.20708: getting the remaining hosts for this loop 13830 1727204088.20710: done getting the remaining hosts for this loop 13830 1727204088.20713: getting the next task for host managed-node3 13830 1727204088.20720: done getting next task for host managed-node3 13830 1727204088.20722: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13830 1727204088.20727: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204088.20733: getting variables 13830 1727204088.20734: in VariableManager get_vars() 13830 1727204088.20762: Calling all_inventory to load vars for managed-node3 13830 1727204088.20766: Calling groups_inventory to load vars for managed-node3 13830 1727204088.20770: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.20779: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.20782: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.20785: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.22347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.23430: done with get_vars() 13830 1727204088.23453: done getting variables 13830 1727204088.23522: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204088.23673: variable 'profile' from source: include params 13830 1727204088.23678: variable 'bond_port_profile' from source: include params 13830 1727204088.23749: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.069) 0:00:21.315 ***** 13830 1727204088.23785: entering _queue_task() for managed-node3/assert 13830 1727204088.24164: worker is 1 (out of 1 available) 13830 1727204088.24178: exiting _queue_task() for managed-node3/assert 13830 1727204088.24189: done queuing things up, now waiting for results queue to drain 13830 1727204088.24191: waiting for pending results... 
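Annotation: when one of these asserts fails, the quickest way to see why is to dump the three flags that drive them. The task below is illustrative only and is not part of assert_profile_present.yml; the fact names are taken from the set_fact result earlier in this run:

- name: Show the flags behind the asserts for {{ profile }}  # illustrative only, not in the test files
  debug:
    msg:
      - "exists: {{ lsr_net_profile_exists }}"
      - "ansible_managed: {{ lsr_net_profile_ansible_managed }}"
      - "fingerprint: {{ lsr_net_profile_fingerprint }}"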
13830 1727204088.24807: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0' 13830 1727204088.24968: in run() - task 0affcd87-79f5-1659-6b02-0000000004db 13830 1727204088.24990: variable 'ansible_search_path' from source: unknown 13830 1727204088.25013: variable 'ansible_search_path' from source: unknown 13830 1727204088.25059: calling self._execute() 13830 1727204088.25172: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.25184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.25198: variable 'omit' from source: magic vars 13830 1727204088.25761: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.25765: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.25768: variable 'omit' from source: magic vars 13830 1727204088.25821: variable 'omit' from source: magic vars 13830 1727204088.25912: variable 'profile' from source: include params 13830 1727204088.25915: variable 'bond_port_profile' from source: include params 13830 1727204088.25957: variable 'bond_port_profile' from source: include params 13830 1727204088.25974: variable 'omit' from source: magic vars 13830 1727204088.26011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204088.26040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204088.26060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204088.26078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.26087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.26113: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204088.26116: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.26119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.26189: Set connection var ansible_connection to ssh 13830 1727204088.26198: Set connection var ansible_timeout to 10 13830 1727204088.26203: Set connection var ansible_shell_executable to /bin/sh 13830 1727204088.26208: Set connection var ansible_shell_type to sh 13830 1727204088.26210: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204088.26220: Set connection var ansible_pipelining to False 13830 1727204088.26242: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.26245: variable 'ansible_connection' from source: unknown 13830 1727204088.26248: variable 'ansible_module_compression' from source: unknown 13830 1727204088.26250: variable 'ansible_shell_type' from source: unknown 13830 1727204088.26253: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.26255: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.26257: variable 'ansible_pipelining' from source: unknown 13830 1727204088.26260: variable 'ansible_timeout' from source: unknown 13830 1727204088.26266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.26371: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204088.26381: variable 'omit' from source: magic vars 13830 1727204088.26386: starting attempt loop 13830 1727204088.26389: running the handler 13830 1727204088.26471: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13830 1727204088.26476: Evaluated conditional (lsr_net_profile_ansible_managed): True 13830 1727204088.26481: handler run complete 13830 1727204088.26493: attempt loop complete, returning result 13830 1727204088.26495: _execute() done 13830 1727204088.26498: dumping result to json 13830 1727204088.26500: done dumping result, returning 13830 1727204088.26507: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0' [0affcd87-79f5-1659-6b02-0000000004db] 13830 1727204088.26512: sending task result for task 0affcd87-79f5-1659-6b02-0000000004db 13830 1727204088.26601: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004db 13830 1727204088.26604: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204088.26654: no more pending results, returning what we have 13830 1727204088.26657: results queue empty 13830 1727204088.26658: checking for any_errors_fatal 13830 1727204088.26665: done checking for any_errors_fatal 13830 1727204088.26666: checking for max_fail_percentage 13830 1727204088.26668: done checking for max_fail_percentage 13830 1727204088.26669: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.26669: done checking to see if all hosts have failed 13830 1727204088.26670: getting the remaining hosts for this loop 13830 1727204088.26672: done getting the remaining hosts for this loop 13830 1727204088.26676: getting the next task for host managed-node3 13830 1727204088.26682: done getting next task for host managed-node3 13830 1727204088.26685: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13830 1727204088.26690: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204088.26693: getting variables 13830 1727204088.26695: in VariableManager get_vars() 13830 1727204088.26728: Calling all_inventory to load vars for managed-node3 13830 1727204088.26736: Calling groups_inventory to load vars for managed-node3 13830 1727204088.26740: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.26755: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.26757: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.26761: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.28372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.29507: done with get_vars() 13830 1727204088.29526: done getting variables 13830 1727204088.29573: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204088.29665: variable 'profile' from source: include params 13830 1727204088.29669: variable 'bond_port_profile' from source: include params 13830 1727204088.29712: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.059) 0:00:21.375 ***** 13830 1727204088.29738: entering _queue_task() for managed-node3/assert 13830 1727204088.29978: worker is 1 (out of 1 available) 13830 1727204088.29993: exiting _queue_task() for managed-node3/assert 13830 1727204088.30006: done queuing things up, now waiting for results queue to drain 13830 1727204088.30008: waiting for pending results... 
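Annotation: the per-profile checks repeat. The nmcli listing earlier shows bond0, bond0.0 and bond0.1, the include params supply bond_port_profile, and right after the fingerprint assert the play re-includes get_profile_stat.yml. That is consistent with a driver loop along these lines (a guess at the calling playbook; the loop values come from the nmcli output, everything else is assumed):

- name: Check each bond profile  # hypothetical driver task
  include_tasks: assert_profile_present.yml
  loop:
    - bond0
    - bond0.0
    - bond0.1
  loop_control:
    loop_var: bond_port_profile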
13830 1727204088.30187: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0 13830 1727204088.30277: in run() - task 0affcd87-79f5-1659-6b02-0000000004dc 13830 1727204088.30287: variable 'ansible_search_path' from source: unknown 13830 1727204088.30291: variable 'ansible_search_path' from source: unknown 13830 1727204088.30319: calling self._execute() 13830 1727204088.30395: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.30399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.30408: variable 'omit' from source: magic vars 13830 1727204088.30756: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.30781: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.30791: variable 'omit' from source: magic vars 13830 1727204088.30852: variable 'omit' from source: magic vars 13830 1727204088.30961: variable 'profile' from source: include params 13830 1727204088.30973: variable 'bond_port_profile' from source: include params 13830 1727204088.31052: variable 'bond_port_profile' from source: include params 13830 1727204088.31078: variable 'omit' from source: magic vars 13830 1727204088.31125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204088.31173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204088.31204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204088.31227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.31248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.31289: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204088.31298: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.31312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.31414: Set connection var ansible_connection to ssh 13830 1727204088.31437: Set connection var ansible_timeout to 10 13830 1727204088.31448: Set connection var ansible_shell_executable to /bin/sh 13830 1727204088.31459: Set connection var ansible_shell_type to sh 13830 1727204088.31470: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204088.31488: Set connection var ansible_pipelining to False 13830 1727204088.31513: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.31520: variable 'ansible_connection' from source: unknown 13830 1727204088.31526: variable 'ansible_module_compression' from source: unknown 13830 1727204088.31536: variable 'ansible_shell_type' from source: unknown 13830 1727204088.31546: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.31552: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.31558: variable 'ansible_pipelining' from source: unknown 13830 1727204088.31575: variable 'ansible_timeout' from source: unknown 13830 1727204088.31587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.31723: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204088.31732: variable 'omit' from source: magic vars 13830 1727204088.31740: starting attempt loop 13830 1727204088.31742: running the handler 13830 1727204088.31858: variable 'lsr_net_profile_fingerprint' from source: set_fact 13830 1727204088.31873: Evaluated conditional (lsr_net_profile_fingerprint): True 13830 1727204088.31883: handler run complete 13830 1727204088.31905: attempt loop complete, returning result 13830 1727204088.31916: _execute() done 13830 1727204088.31921: dumping result to json 13830 1727204088.31928: done dumping result, returning 13830 1727204088.31937: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0 [0affcd87-79f5-1659-6b02-0000000004dc] 13830 1727204088.31945: sending task result for task 0affcd87-79f5-1659-6b02-0000000004dc 13830 1727204088.32054: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004dc 13830 1727204088.32062: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204088.32130: no more pending results, returning what we have 13830 1727204088.32134: results queue empty 13830 1727204088.32135: checking for any_errors_fatal 13830 1727204088.32140: done checking for any_errors_fatal 13830 1727204088.32141: checking for max_fail_percentage 13830 1727204088.32143: done checking for max_fail_percentage 13830 1727204088.32144: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.32145: done checking to see if all hosts have failed 13830 1727204088.32146: getting the remaining hosts for this loop 13830 1727204088.32147: done getting the remaining hosts for this loop 13830 1727204088.32159: getting the next task for host managed-node3 13830 1727204088.32172: done getting next task for host managed-node3 13830 1727204088.32174: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13830 1727204088.32179: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204088.32185: getting variables 13830 1727204088.32186: in VariableManager get_vars() 13830 1727204088.32223: Calling all_inventory to load vars for managed-node3 13830 1727204088.32226: Calling groups_inventory to load vars for managed-node3 13830 1727204088.32230: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.32243: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.32246: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.32249: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.33602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.38342: done with get_vars() 13830 1727204088.38361: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.086) 0:00:21.462 ***** 13830 1727204088.38430: entering _queue_task() for managed-node3/include_tasks 13830 1727204088.38666: worker is 1 (out of 1 available) 13830 1727204088.38679: exiting _queue_task() for managed-node3/include_tasks 13830 1727204088.38691: done queuing things up, now waiting for results queue to drain 13830 1727204088.38692: waiting for pending results... 13830 1727204088.38875: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 13830 1727204088.38982: in run() - task 0affcd87-79f5-1659-6b02-0000000004e0 13830 1727204088.38992: variable 'ansible_search_path' from source: unknown 13830 1727204088.38996: variable 'ansible_search_path' from source: unknown 13830 1727204088.39027: calling self._execute() 13830 1727204088.39100: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.39104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.39111: variable 'omit' from source: magic vars 13830 1727204088.39401: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.39412: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.39418: _execute() done 13830 1727204088.39421: dumping result to json 13830 1727204088.39423: done dumping result, returning 13830 1727204088.39430: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1659-6b02-0000000004e0] 13830 1727204088.39439: sending task result for task 0affcd87-79f5-1659-6b02-0000000004e0 13830 1727204088.39536: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004e0 13830 1727204088.39539: WORKER PROCESS EXITING 13830 1727204088.39575: no more pending results, returning what we have 13830 1727204088.39580: in VariableManager get_vars() 13830 1727204088.39617: Calling all_inventory to load vars for managed-node3 13830 1727204088.39620: Calling groups_inventory to load vars for managed-node3 13830 1727204088.39623: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.39637: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.39640: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.39643: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.40448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 13830 1727204088.42075: done with get_vars() 13830 1727204088.42093: variable 'ansible_search_path' from source: unknown 13830 1727204088.42094: variable 'ansible_search_path' from source: unknown 13830 1727204088.42123: we have included files to process 13830 1727204088.42124: generating all_blocks data 13830 1727204088.42126: done generating all_blocks data 13830 1727204088.42132: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204088.42133: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204088.42134: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204088.42767: done processing included file 13830 1727204088.42768: iterating over new_blocks loaded from include file 13830 1727204088.42769: in VariableManager get_vars() 13830 1727204088.42782: done with get_vars() 13830 1727204088.42783: filtering new block on tags 13830 1727204088.42836: done filtering new block on tags 13830 1727204088.42838: in VariableManager get_vars() 13830 1727204088.42849: done with get_vars() 13830 1727204088.42850: filtering new block on tags 13830 1727204088.42887: done filtering new block on tags 13830 1727204088.42889: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 13830 1727204088.42893: extending task lists for all hosts with included blocks 13830 1727204088.43162: done extending task lists 13830 1727204088.43163: done processing included files 13830 1727204088.43165: results queue empty 13830 1727204088.43166: checking for any_errors_fatal 13830 1727204088.43168: done checking for any_errors_fatal 13830 1727204088.43169: checking for max_fail_percentage 13830 1727204088.43170: done checking for max_fail_percentage 13830 1727204088.43170: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.43171: done checking to see if all hosts have failed 13830 1727204088.43171: getting the remaining hosts for this loop 13830 1727204088.43172: done getting the remaining hosts for this loop 13830 1727204088.43174: getting the next task for host managed-node3 13830 1727204088.43177: done getting next task for host managed-node3 13830 1727204088.43178: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13830 1727204088.43181: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204088.43183: getting variables 13830 1727204088.43183: in VariableManager get_vars() 13830 1727204088.43189: Calling all_inventory to load vars for managed-node3 13830 1727204088.43191: Calling groups_inventory to load vars for managed-node3 13830 1727204088.43192: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.43196: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.43198: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.43199: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.44579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.46305: done with get_vars() 13830 1727204088.46336: done getting variables 13830 1727204088.46385: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.079) 0:00:21.542 ***** 13830 1727204088.46422: entering _queue_task() for managed-node3/set_fact 13830 1727204088.46761: worker is 1 (out of 1 available) 13830 1727204088.46773: exiting _queue_task() for managed-node3/set_fact 13830 1727204088.46786: done queuing things up, now waiting for results queue to drain 13830 1727204088.46787: waiting for pending results... 
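The worker that picks this up runs the set_fact handler for get_profile_stat.yml:3; its result (shown further on in this section) initializes three lsr_net_profile_* facts to false before the profile checks that follow. A minimal sketch of what that task plausibly looks like, reconstructed only from the fact names and values reported in this run (the task file itself is not quoted here):

    # Hypothetical reconstruction of the task at get_profile_stat.yml:3.
    # The three fact names and their initial false values come from the
    # logged result; everything else is assumed.
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
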
13830 1727204088.47086: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13830 1727204088.47220: in run() - task 0affcd87-79f5-1659-6b02-000000000558 13830 1727204088.47240: variable 'ansible_search_path' from source: unknown 13830 1727204088.47244: variable 'ansible_search_path' from source: unknown 13830 1727204088.47280: calling self._execute() 13830 1727204088.47375: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.47379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.47388: variable 'omit' from source: magic vars 13830 1727204088.47770: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.47788: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.47794: variable 'omit' from source: magic vars 13830 1727204088.47858: variable 'omit' from source: magic vars 13830 1727204088.47893: variable 'omit' from source: magic vars 13830 1727204088.47931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204088.47968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204088.47987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204088.48006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.48018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.48048: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204088.48051: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.48054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.48150: Set connection var ansible_connection to ssh 13830 1727204088.48162: Set connection var ansible_timeout to 10 13830 1727204088.48168: Set connection var ansible_shell_executable to /bin/sh 13830 1727204088.48171: Set connection var ansible_shell_type to sh 13830 1727204088.48177: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204088.48187: Set connection var ansible_pipelining to False 13830 1727204088.48215: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.48218: variable 'ansible_connection' from source: unknown 13830 1727204088.48222: variable 'ansible_module_compression' from source: unknown 13830 1727204088.48224: variable 'ansible_shell_type' from source: unknown 13830 1727204088.48227: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.48229: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.48231: variable 'ansible_pipelining' from source: unknown 13830 1727204088.48237: variable 'ansible_timeout' from source: unknown 13830 1727204088.48240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.48391: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204088.48402: variable 
'omit' from source: magic vars 13830 1727204088.48407: starting attempt loop 13830 1727204088.48411: running the handler 13830 1727204088.48424: handler run complete 13830 1727204088.48443: attempt loop complete, returning result 13830 1727204088.48446: _execute() done 13830 1727204088.48448: dumping result to json 13830 1727204088.48451: done dumping result, returning 13830 1727204088.48457: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1659-6b02-000000000558] 13830 1727204088.48462: sending task result for task 0affcd87-79f5-1659-6b02-000000000558 13830 1727204088.48552: done sending task result for task 0affcd87-79f5-1659-6b02-000000000558 13830 1727204088.48556: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13830 1727204088.48619: no more pending results, returning what we have 13830 1727204088.48623: results queue empty 13830 1727204088.48624: checking for any_errors_fatal 13830 1727204088.48627: done checking for any_errors_fatal 13830 1727204088.48628: checking for max_fail_percentage 13830 1727204088.48632: done checking for max_fail_percentage 13830 1727204088.48634: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.48635: done checking to see if all hosts have failed 13830 1727204088.48635: getting the remaining hosts for this loop 13830 1727204088.48637: done getting the remaining hosts for this loop 13830 1727204088.48642: getting the next task for host managed-node3 13830 1727204088.48653: done getting next task for host managed-node3 13830 1727204088.48655: ^ task is: TASK: Stat profile file 13830 1727204088.48665: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204088.48669: getting variables 13830 1727204088.48671: in VariableManager get_vars() 13830 1727204088.48709: Calling all_inventory to load vars for managed-node3 13830 1727204088.48712: Calling groups_inventory to load vars for managed-node3 13830 1727204088.48716: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.48728: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.48734: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.48737: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.50759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.52563: done with get_vars() 13830 1727204088.52590: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.062) 0:00:21.604 ***** 13830 1727204088.52670: entering _queue_task() for managed-node3/stat 13830 1727204088.52902: worker is 1 (out of 1 available) 13830 1727204088.52916: exiting _queue_task() for managed-node3/stat 13830 1727204088.52927: done queuing things up, now waiting for results queue to drain 13830 1727204088.52928: waiting for pending results... 13830 1727204088.53110: running TaskExecutor() for managed-node3/TASK: Stat profile file 13830 1727204088.53212: in run() - task 0affcd87-79f5-1659-6b02-000000000559 13830 1727204088.53375: variable 'ansible_search_path' from source: unknown 13830 1727204088.53380: variable 'ansible_search_path' from source: unknown 13830 1727204088.53383: calling self._execute() 13830 1727204088.53386: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.53389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.53391: variable 'omit' from source: magic vars 13830 1727204088.53754: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.53765: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.53773: variable 'omit' from source: magic vars 13830 1727204088.53836: variable 'omit' from source: magic vars 13830 1727204088.53929: variable 'profile' from source: include params 13830 1727204088.53932: variable 'bond_port_profile' from source: include params 13830 1727204088.53998: variable 'bond_port_profile' from source: include params 13830 1727204088.54018: variable 'omit' from source: magic vars 13830 1727204088.54062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204088.54098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204088.54118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204088.54138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.54149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.54720: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204088.54731: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 
1727204088.54745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.54881: Set connection var ansible_connection to ssh 13830 1727204088.54898: Set connection var ansible_timeout to 10 13830 1727204088.54909: Set connection var ansible_shell_executable to /bin/sh 13830 1727204088.54915: Set connection var ansible_shell_type to sh 13830 1727204088.54926: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204088.54940: Set connection var ansible_pipelining to False 13830 1727204088.54977: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.54985: variable 'ansible_connection' from source: unknown 13830 1727204088.54992: variable 'ansible_module_compression' from source: unknown 13830 1727204088.54999: variable 'ansible_shell_type' from source: unknown 13830 1727204088.55006: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.55012: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.55019: variable 'ansible_pipelining' from source: unknown 13830 1727204088.55025: variable 'ansible_timeout' from source: unknown 13830 1727204088.55032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.55246: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204088.55261: variable 'omit' from source: magic vars 13830 1727204088.55272: starting attempt loop 13830 1727204088.55279: running the handler 13830 1727204088.55302: _low_level_execute_command(): starting 13830 1727204088.55314: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204088.56142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204088.56162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.56185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.56209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.56253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.56271: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204088.56286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.56308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204088.56320: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204088.56331: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204088.56344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.56357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.56376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.56391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.56403: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204088.56421: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.56494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204088.56517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204088.56541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204088.57130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204088.58355: stdout chunk (state=3): >>>/root <<< 13830 1727204088.58549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204088.58552: stdout chunk (state=3): >>><<< 13830 1727204088.58555: stderr chunk (state=3): >>><<< 13830 1727204088.58684: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204088.58687: _low_level_execute_command(): starting 13830 1727204088.58690: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686 `" && echo ansible-tmp-1727204088.5858657-15301-134321565575686="` echo /root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686 `" ) && sleep 0' 13830 1727204088.59500: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204088.59518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.59537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.59567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.59613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.59628: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204088.59645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.59667: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204088.59685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204088.59699: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 
1727204088.59713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.59728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.59746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.59760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.59777: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204088.59798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.59878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204088.59909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204088.59929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204088.60017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204088.61981: stdout chunk (state=3): >>>ansible-tmp-1727204088.5858657-15301-134321565575686=/root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686 <<< 13830 1727204088.62157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204088.62244: stderr chunk (state=3): >>><<< 13830 1727204088.62256: stdout chunk (state=3): >>><<< 13830 1727204088.62575: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204088.5858657-15301-134321565575686=/root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204088.62579: variable 'ansible_module_compression' from source: unknown 13830 1727204088.62581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13830 1727204088.62584: variable 'ansible_facts' from source: unknown 13830 1727204088.62586: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686/AnsiballZ_stat.py 13830 1727204088.62692: Sending initial data 13830 1727204088.62695: Sent initial data (153 bytes) 13830 1727204088.63658: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204088.63682: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13830 1727204088.63697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.63713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.63754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.63768: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204088.63798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.63821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204088.63835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204088.63860: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204088.63876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.63914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.63945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.63956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.63971: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204088.63986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.64098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204088.64130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204088.64146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204088.64245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204088.66111: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204088.66151: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204088.66194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp9qkchuh9 /root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686/AnsiballZ_stat.py <<< 13830 1727204088.66243: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204088.67787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204088.67997: stderr chunk (state=3): >>><<< 13830 1727204088.68000: stdout chunk (state=3): >>><<< 13830 1727204088.68002: done transferring module to remote 13830 1727204088.68005: _low_level_execute_command(): starting 13830 1727204088.68013: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686/ /root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686/AnsiballZ_stat.py && sleep 0' 13830 1727204088.68735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204088.68750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.68768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.68787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.68828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.68842: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204088.68856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.68877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204088.68888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204088.68898: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204088.68910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.68923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.68941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.68953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.68967: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204088.68981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.69057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204088.69082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204088.69098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204088.69176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204088.71025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204088.71133: stderr chunk (state=3): >>><<< 13830 1727204088.71136: stdout chunk (state=3): >>><<< 13830 1727204088.71161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204088.71166: _low_level_execute_command(): starting 13830 1727204088.71169: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686/AnsiballZ_stat.py && sleep 0' 13830 1727204088.71854: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204088.71882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.71900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.71928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.71995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.72008: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204088.72026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.72045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204088.72076: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204088.72090: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204088.72101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.72110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.72125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.72135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.72138: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204088.72148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.72256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204088.72263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204088.72268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204088.72351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204088.85400: stdout chunk (state=3): >>> <<< 13830 1727204088.85408: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13830 1727204088.86437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204088.86441: stdout chunk (state=3): >>><<< 13830 1727204088.86443: stderr chunk (state=3): >>><<< 13830 1727204088.86460: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204088.86492: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204088.86501: _low_level_execute_command(): starting 13830 1727204088.86506: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204088.5858657-15301-134321565575686/ > /dev/null 2>&1 && sleep 0' 13830 1727204088.87146: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204088.87154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.87167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.87181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.87220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.87226: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204088.87236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.87249: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204088.87256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204088.87263: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204088.87275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204088.87284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204088.87294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204088.87301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204088.87308: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204088.87317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204088.87389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204088.87403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204088.87413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204088.87484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204088.89311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204088.89314: stdout chunk (state=3): >>><<< 13830 1727204088.89317: stderr chunk (state=3): >>><<< 13830 1727204088.89727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204088.89733: handler run complete 13830 1727204088.89736: attempt loop complete, returning result 13830 1727204088.89738: _execute() done 13830 1727204088.89740: dumping result to json 13830 1727204088.89742: done dumping result, returning 13830 1727204088.89745: done running TaskExecutor() for managed-node3/TASK: Stat profile file [0affcd87-79f5-1659-6b02-000000000559] 13830 1727204088.89747: sending task result for task 0affcd87-79f5-1659-6b02-000000000559 13830 1727204088.89820: done sending task result for task 0affcd87-79f5-1659-6b02-000000000559 13830 1727204088.89823: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 13830 1727204088.89886: no more pending results, returning what we have 13830 
1727204088.89889: results queue empty 13830 1727204088.89890: checking for any_errors_fatal 13830 1727204088.89896: done checking for any_errors_fatal 13830 1727204088.89896: checking for max_fail_percentage 13830 1727204088.89898: done checking for max_fail_percentage 13830 1727204088.89899: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.89900: done checking to see if all hosts have failed 13830 1727204088.89901: getting the remaining hosts for this loop 13830 1727204088.89902: done getting the remaining hosts for this loop 13830 1727204088.89906: getting the next task for host managed-node3 13830 1727204088.89913: done getting next task for host managed-node3 13830 1727204088.89916: ^ task is: TASK: Set NM profile exist flag based on the profile files 13830 1727204088.89922: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204088.89925: getting variables 13830 1727204088.89926: in VariableManager get_vars() 13830 1727204088.89957: Calling all_inventory to load vars for managed-node3 13830 1727204088.89960: Calling groups_inventory to load vars for managed-node3 13830 1727204088.89965: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.89980: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.89982: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.89986: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.91770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.92692: done with get_vars() 13830 1727204088.92709: done getting variables 13830 1727204088.92755: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.401) 0:00:22.005 ***** 13830 1727204088.92784: entering _queue_task() for managed-node3/set_fact 13830 1727204088.93014: worker is 1 (out of 1 available) 13830 1727204088.93027: exiting _queue_task() for managed-node3/set_fact 13830 1727204088.93041: done queuing things up, now waiting for results queue to drain 13830 1727204088.93043: waiting for pending results... 
13830 1727204088.93230: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 13830 1727204088.93406: in run() - task 0affcd87-79f5-1659-6b02-00000000055a 13830 1727204088.93447: variable 'ansible_search_path' from source: unknown 13830 1727204088.93456: variable 'ansible_search_path' from source: unknown 13830 1727204088.93502: calling self._execute() 13830 1727204088.93600: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.93618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.93633: variable 'omit' from source: magic vars 13830 1727204088.94024: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.94046: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.94236: variable 'profile_stat' from source: set_fact 13830 1727204088.94277: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204088.94296: when evaluation is False, skipping this task 13830 1727204088.94308: _execute() done 13830 1727204088.94314: dumping result to json 13830 1727204088.94321: done dumping result, returning 13830 1727204088.94330: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1659-6b02-00000000055a] 13830 1727204088.94341: sending task result for task 0affcd87-79f5-1659-6b02-00000000055a 13830 1727204088.94473: done sending task result for task 0affcd87-79f5-1659-6b02-00000000055a 13830 1727204088.94476: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204088.94554: no more pending results, returning what we have 13830 1727204088.94559: results queue empty 13830 1727204088.94560: checking for any_errors_fatal 13830 1727204088.94570: done checking for any_errors_fatal 13830 1727204088.94571: checking for max_fail_percentage 13830 1727204088.94573: done checking for max_fail_percentage 13830 1727204088.94574: checking to see if all hosts have failed and the running result is not ok 13830 1727204088.94574: done checking to see if all hosts have failed 13830 1727204088.94575: getting the remaining hosts for this loop 13830 1727204088.94577: done getting the remaining hosts for this loop 13830 1727204088.94581: getting the next task for host managed-node3 13830 1727204088.94588: done getting next task for host managed-node3 13830 1727204088.94591: ^ task is: TASK: Get NM profile info 13830 1727204088.94598: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204088.94602: getting variables 13830 1727204088.94604: in VariableManager get_vars() 13830 1727204088.94656: Calling all_inventory to load vars for managed-node3 13830 1727204088.94659: Calling groups_inventory to load vars for managed-node3 13830 1727204088.94662: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204088.94674: Calling all_plugins_play to load vars for managed-node3 13830 1727204088.94677: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204088.94680: Calling groups_plugins_play to load vars for managed-node3 13830 1727204088.95499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204088.96797: done with get_vars() 13830 1727204088.96821: done getting variables 13830 1727204088.96887: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.041) 0:00:22.047 ***** 13830 1727204088.96922: entering _queue_task() for managed-node3/shell 13830 1727204088.97255: worker is 1 (out of 1 available) 13830 1727204088.97270: exiting _queue_task() for managed-node3/shell 13830 1727204088.97281: done queuing things up, now waiting for results queue to drain 13830 1727204088.97283: waiting for pending results... 
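The command this shell task runs is not visible in this excerpt (the output is cut off before the module arguments appear), so the sketch below only illustrates the general shape such a task could take; using nmcli to list connection names and backing files and filtering for the profile is an assumption, as are the register name and the error handling:

    # Hypothetical sketch only; the real command at get_profile_stat.yml:25
    # is not shown in this excerpt, and nm_profile_exists is an invented name.
    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"
      register: nm_profile_exists
      failed_when: false   # grep exits non-zero when nothing matches
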
13830 1727204088.98212: running TaskExecutor() for managed-node3/TASK: Get NM profile info 13830 1727204088.98726: in run() - task 0affcd87-79f5-1659-6b02-00000000055b 13830 1727204088.98777: variable 'ansible_search_path' from source: unknown 13830 1727204088.98787: variable 'ansible_search_path' from source: unknown 13830 1727204088.98802: calling self._execute() 13830 1727204088.98899: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.98908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.98912: variable 'omit' from source: magic vars 13830 1727204088.99184: variable 'ansible_distribution_major_version' from source: facts 13830 1727204088.99195: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204088.99200: variable 'omit' from source: magic vars 13830 1727204088.99248: variable 'omit' from source: magic vars 13830 1727204088.99319: variable 'profile' from source: include params 13830 1727204088.99323: variable 'bond_port_profile' from source: include params 13830 1727204088.99374: variable 'bond_port_profile' from source: include params 13830 1727204088.99390: variable 'omit' from source: magic vars 13830 1727204088.99424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204088.99456: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204088.99474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204088.99488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.99499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204088.99521: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204088.99525: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.99527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.99600: Set connection var ansible_connection to ssh 13830 1727204088.99609: Set connection var ansible_timeout to 10 13830 1727204088.99614: Set connection var ansible_shell_executable to /bin/sh 13830 1727204088.99617: Set connection var ansible_shell_type to sh 13830 1727204088.99622: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204088.99629: Set connection var ansible_pipelining to False 13830 1727204088.99648: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.99651: variable 'ansible_connection' from source: unknown 13830 1727204088.99654: variable 'ansible_module_compression' from source: unknown 13830 1727204088.99658: variable 'ansible_shell_type' from source: unknown 13830 1727204088.99660: variable 'ansible_shell_executable' from source: unknown 13830 1727204088.99663: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204088.99665: variable 'ansible_pipelining' from source: unknown 13830 1727204088.99669: variable 'ansible_timeout' from source: unknown 13830 1727204088.99672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204088.99772: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204088.99783: variable 'omit' from source: magic vars 13830 1727204088.99786: starting attempt loop 13830 1727204088.99789: running the handler 13830 1727204088.99799: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204088.99815: _low_level_execute_command(): starting 13830 1727204088.99823: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204089.00970: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204089.00990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204089.01604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204089.02671: stdout chunk (state=3): >>>/root <<< 13830 1727204089.02846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204089.02850: stdout chunk (state=3): >>><<< 13830 1727204089.02860: stderr chunk (state=3): >>><<< 13830 1727204089.02885: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 13830 1727204089.02901: _low_level_execute_command(): starting 13830 1727204089.02905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890 `" && echo ansible-tmp-1727204089.0288599-15328-157080283527890="` echo /root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890 `" ) && sleep 0' 13830 1727204089.03619: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.03624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.03679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.03682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204089.03695: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.03701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.03713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204089.03718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.03801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204089.03819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204089.03887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204089.05799: stdout chunk (state=3): >>>ansible-tmp-1727204089.0288599-15328-157080283527890=/root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890 <<< 13830 1727204089.05991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204089.05995: stdout chunk (state=3): >>><<< 13830 1727204089.06000: stderr chunk (state=3): >>><<< 13830 1727204089.06030: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204089.0288599-15328-157080283527890=/root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204089.06061: variable 'ansible_module_compression' from source: unknown 13830 1727204089.06118: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204089.06156: variable 'ansible_facts' from source: unknown 13830 1727204089.06223: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890/AnsiballZ_command.py 13830 1727204089.06410: Sending initial data 13830 1727204089.06413: Sent initial data (156 bytes) 13830 1727204089.07424: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204089.07438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.07448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204089.07463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.07504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204089.07514: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204089.07527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.07543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204089.07555: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204089.07558: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204089.07567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.07578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204089.07590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.07598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204089.07604: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204089.07616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.07695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204089.07713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204089.07724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204089.07806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204089.09655: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204089.09694: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204089.09738: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp9wdyxtt4 /root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890/AnsiballZ_command.py <<< 13830 1727204089.09777: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204089.10956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204089.10963: stderr chunk (state=3): >>><<< 13830 1727204089.10979: stdout chunk (state=3): >>><<< 13830 1727204089.11000: done transferring module to remote 13830 1727204089.11011: _low_level_execute_command(): starting 13830 1727204089.11016: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890/ /root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890/AnsiballZ_command.py && sleep 0' 13830 1727204089.11660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.11666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.11711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.11719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204089.11722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.11777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204089.11785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204089.11836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204089.13755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204089.13758: stdout chunk (state=3): >>><<< 13830 1727204089.13761: stderr chunk (state=3): >>><<< 13830 1727204089.13877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204089.13880: _low_level_execute_command(): starting 13830 1727204089.13883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890/AnsiballZ_command.py && sleep 0' 13830 1727204089.14472: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204089.14486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.14500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204089.14522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.14568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204089.14580: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204089.14593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.14609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204089.14620: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204089.14642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204089.14655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.14671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204089.14688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.14701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204089.14717: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204089.14759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.14908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204089.14928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204089.14946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204089.15043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204089.31161: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:49.287525", "end": "2024-09-24 14:54:49.310567", "delta": "0:00:00.023042", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep 
bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204089.32484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204089.32488: stdout chunk (state=3): >>><<< 13830 1727204089.32490: stderr chunk (state=3): >>><<< 13830 1727204089.32570: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:49.287525", "end": "2024-09-24 14:54:49.310567", "delta": "0:00:00.023042", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
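The module result above shows exactly what the shell action ran on the host: `nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc`, which exited 0 and printed the profile name and its keyfile path. Reconstructed from the logged `_raw_params` and from the `nm_profile_exists.rc == 0` conditional evaluated a little later, the task at get_profile_stat.yml:25 plausibly looks like the sketch below; the `failed_when: false` line is an assumption added so a non-matching grep would not fail the play, and the real task may handle that differently:

```yaml
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  failed_when: false   # assumption: grep exits 1 when no profile matches, which should not abort the test
```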
13830 1727204089.32574: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204089.32580: _low_level_execute_command(): starting 13830 1727204089.32582: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204089.0288599-15328-157080283527890/ > /dev/null 2>&1 && sleep 0' 13830 1727204089.33310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204089.33326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.33351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204089.33374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.33419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204089.33434: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204089.33453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.33480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204089.33493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204089.33504: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204089.33516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204089.33533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204089.33551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204089.33568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204089.33586: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204089.33600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204089.33684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204089.33711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204089.33729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204089.33809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204089.35612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204089.35728: stderr chunk (state=3): >>><<< 13830 1727204089.35756: stdout chunk (state=3): >>><<< 13830 1727204089.35875: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204089.35878: handler run complete 13830 1727204089.35881: Evaluated conditional (False): False 13830 1727204089.35883: attempt loop complete, returning result 13830 1727204089.35885: _execute() done 13830 1727204089.35887: dumping result to json 13830 1727204089.35889: done dumping result, returning 13830 1727204089.35891: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [0affcd87-79f5-1659-6b02-00000000055b] 13830 1727204089.35894: sending task result for task 0affcd87-79f5-1659-6b02-00000000055b 13830 1727204089.36162: done sending task result for task 0affcd87-79f5-1659-6b02-00000000055b 13830 1727204089.36167: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.023042", "end": "2024-09-24 14:54:49.310567", "rc": 0, "start": "2024-09-24 14:54:49.287525" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 13830 1727204089.36257: no more pending results, returning what we have 13830 1727204089.36261: results queue empty 13830 1727204089.36262: checking for any_errors_fatal 13830 1727204089.36272: done checking for any_errors_fatal 13830 1727204089.36273: checking for max_fail_percentage 13830 1727204089.36276: done checking for max_fail_percentage 13830 1727204089.36277: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.36277: done checking to see if all hosts have failed 13830 1727204089.36278: getting the remaining hosts for this loop 13830 1727204089.36280: done getting the remaining hosts for this loop 13830 1727204089.36284: getting the next task for host managed-node3 13830 1727204089.36294: done getting next task for host managed-node3 13830 1727204089.36296: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13830 1727204089.36303: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204089.36306: getting variables 13830 1727204089.36308: in VariableManager get_vars() 13830 1727204089.36346: Calling all_inventory to load vars for managed-node3 13830 1727204089.36349: Calling groups_inventory to load vars for managed-node3 13830 1727204089.36352: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.36366: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.36368: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.36372: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.38326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.40030: done with get_vars() 13830 1727204089.40054: done getting variables 13830 1727204089.40116: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.432) 0:00:22.479 ***** 13830 1727204089.40155: entering _queue_task() for managed-node3/set_fact 13830 1727204089.40468: worker is 1 (out of 1 available) 13830 1727204089.40479: exiting _queue_task() for managed-node3/set_fact 13830 1727204089.40490: done queuing things up, now waiting for results queue to drain 13830 1727204089.40492: waiting for pending results... 
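Before running that command the executor resolved the connection settings logged above (`Set connection var ...`). With pipelining off, the command module is zipped (the cached ANSIBALLZ payload), SFTP'd into a per-task remote tmp directory, chmod'ed, executed with /usr/bin/python3.9, and the tmp directory is removed afterwards, all over the existing OpenSSH ControlMaster session (`auto-mux: Trying existing master`). The effective values, written out as host variables purely for illustration; most of these are Ansible defaults rather than something this test sets explicitly:

```yaml
ansible_connection: ssh
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false            # so AnsiballZ_command.py is copied to ~/.ansible/tmp and run there
ansible_module_compression: ZIP_DEFLATED
```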
13830 1727204089.40786: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13830 1727204089.40908: in run() - task 0affcd87-79f5-1659-6b02-00000000055c 13830 1727204089.40922: variable 'ansible_search_path' from source: unknown 13830 1727204089.40925: variable 'ansible_search_path' from source: unknown 13830 1727204089.40967: calling self._execute() 13830 1727204089.41049: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.41055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.41068: variable 'omit' from source: magic vars 13830 1727204089.41450: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.41463: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.41605: variable 'nm_profile_exists' from source: set_fact 13830 1727204089.41620: Evaluated conditional (nm_profile_exists.rc == 0): True 13830 1727204089.41625: variable 'omit' from source: magic vars 13830 1727204089.41688: variable 'omit' from source: magic vars 13830 1727204089.41727: variable 'omit' from source: magic vars 13830 1727204089.41771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204089.41810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204089.41834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204089.41851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.41861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.41894: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204089.41897: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.41899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.42001: Set connection var ansible_connection to ssh 13830 1727204089.42012: Set connection var ansible_timeout to 10 13830 1727204089.42213: Set connection var ansible_shell_executable to /bin/sh 13830 1727204089.42216: Set connection var ansible_shell_type to sh 13830 1727204089.42218: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204089.42221: Set connection var ansible_pipelining to False 13830 1727204089.42223: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.42225: variable 'ansible_connection' from source: unknown 13830 1727204089.42227: variable 'ansible_module_compression' from source: unknown 13830 1727204089.42229: variable 'ansible_shell_type' from source: unknown 13830 1727204089.42234: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.42237: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.42240: variable 'ansible_pipelining' from source: unknown 13830 1727204089.42242: variable 'ansible_timeout' from source: unknown 13830 1727204089.42245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.42269: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204089.42273: variable 'omit' from source: magic vars 13830 1727204089.42275: starting attempt loop 13830 1727204089.42277: running the handler 13830 1727204089.42279: handler run complete 13830 1727204089.42282: attempt loop complete, returning result 13830 1727204089.42284: _execute() done 13830 1727204089.42286: dumping result to json 13830 1727204089.42288: done dumping result, returning 13830 1727204089.42445: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1659-6b02-00000000055c] 13830 1727204089.42448: sending task result for task 0affcd87-79f5-1659-6b02-00000000055c 13830 1727204089.42515: done sending task result for task 0affcd87-79f5-1659-6b02-00000000055c 13830 1727204089.42518: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13830 1727204089.42574: no more pending results, returning what we have 13830 1727204089.42577: results queue empty 13830 1727204089.42578: checking for any_errors_fatal 13830 1727204089.42585: done checking for any_errors_fatal 13830 1727204089.42586: checking for max_fail_percentage 13830 1727204089.42588: done checking for max_fail_percentage 13830 1727204089.42588: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.42589: done checking to see if all hosts have failed 13830 1727204089.42590: getting the remaining hosts for this loop 13830 1727204089.42592: done getting the remaining hosts for this loop 13830 1727204089.42596: getting the next task for host managed-node3 13830 1727204089.42605: done getting next task for host managed-node3 13830 1727204089.42608: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13830 1727204089.42614: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204089.42617: getting variables 13830 1727204089.42618: in VariableManager get_vars() 13830 1727204089.42649: Calling all_inventory to load vars for managed-node3 13830 1727204089.42653: Calling groups_inventory to load vars for managed-node3 13830 1727204089.42656: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.42669: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.42671: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.42675: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.44997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.46884: done with get_vars() 13830 1727204089.46918: done getting variables 13830 1727204089.46980: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204089.47100: variable 'profile' from source: include params 13830 1727204089.47104: variable 'bond_port_profile' from source: include params 13830 1727204089.47162: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.070) 0:00:22.550 ***** 13830 1727204089.47198: entering _queue_task() for managed-node3/command 13830 1727204089.48103: worker is 1 (out of 1 available) 13830 1727204089.48115: exiting _queue_task() for managed-node3/command 13830 1727204089.48126: done queuing things up, now waiting for results queue to drain 13830 1727204089.48127: waiting for pending results... 
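The set_fact result above records three flags that later assertions rely on, and the guard is the `nm_profile_exists.rc == 0` conditional the executor just evaluated. A sketch consistent with the logged facts and conditional, though the real task at get_profile_stat.yml:35 may phrase it differently:

```yaml
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when:
    - nm_profile_exists.rc == 0
```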
13830 1727204089.48401: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 13830 1727204089.48522: in run() - task 0affcd87-79f5-1659-6b02-00000000055e 13830 1727204089.48536: variable 'ansible_search_path' from source: unknown 13830 1727204089.48540: variable 'ansible_search_path' from source: unknown 13830 1727204089.48576: calling self._execute() 13830 1727204089.48671: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.48677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.48692: variable 'omit' from source: magic vars 13830 1727204089.49071: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.49083: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.49206: variable 'profile_stat' from source: set_fact 13830 1727204089.49218: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204089.49221: when evaluation is False, skipping this task 13830 1727204089.49229: _execute() done 13830 1727204089.49234: dumping result to json 13830 1727204089.49237: done dumping result, returning 13830 1727204089.49243: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affcd87-79f5-1659-6b02-00000000055e] 13830 1727204089.49249: sending task result for task 0affcd87-79f5-1659-6b02-00000000055e skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204089.49397: no more pending results, returning what we have 13830 1727204089.49402: results queue empty 13830 1727204089.49402: checking for any_errors_fatal 13830 1727204089.49412: done checking for any_errors_fatal 13830 1727204089.49413: checking for max_fail_percentage 13830 1727204089.49415: done checking for max_fail_percentage 13830 1727204089.49416: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.49416: done checking to see if all hosts have failed 13830 1727204089.49417: getting the remaining hosts for this loop 13830 1727204089.49419: done getting the remaining hosts for this loop 13830 1727204089.49423: getting the next task for host managed-node3 13830 1727204089.49431: done getting next task for host managed-node3 13830 1727204089.49434: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13830 1727204089.49441: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204089.49445: getting variables 13830 1727204089.49446: in VariableManager get_vars() 13830 1727204089.49482: Calling all_inventory to load vars for managed-node3 13830 1727204089.49485: Calling groups_inventory to load vars for managed-node3 13830 1727204089.49489: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.49495: done sending task result for task 0affcd87-79f5-1659-6b02-00000000055e 13830 1727204089.49501: WORKER PROCESS EXITING 13830 1727204089.49514: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.49517: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.49521: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.51419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.54742: done with get_vars() 13830 1727204089.54780: done getting variables 13830 1727204089.54845: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204089.54968: variable 'profile' from source: include params 13830 1727204089.54972: variable 'bond_port_profile' from source: include params 13830 1727204089.55035: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.078) 0:00:22.631 ***** 13830 1727204089.55299: entering _queue_task() for managed-node3/set_fact 13830 1727204089.55682: worker is 1 (out of 1 available) 13830 1727204089.55692: exiting _queue_task() for managed-node3/set_fact 13830 1727204089.55705: done queuing things up, now waiting for results queue to drain 13830 1727204089.55707: waiting for pending results... 
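The ifcfg-oriented checks keep being skipped for the same reason: `profile_stat.stat.exists` is False, even though nmcli just reported the bond0.0 profile. That is consistent with this host storing the connection as a NetworkManager keyfile (`/etc/NetworkManager/system-connections/bond0.0.nmconnection`, as the command output shows) rather than as a legacy initscripts ifcfg file, assuming `profile_stat` was registered by statting the ifcfg path. An illustrative check of the keyfile path, not a task from the test itself:

```yaml
- name: Stat the keyfile nmcli reported for bond0.0 (illustration only)
  ansible.builtin.stat:
    path: /etc/NetworkManager/system-connections/bond0.0.nmconnection
  register: keyfile_stat   # keyfile_stat.stat.exists would be true here, unlike profile_stat
```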
13830 1727204089.55991: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 13830 1727204089.56128: in run() - task 0affcd87-79f5-1659-6b02-00000000055f 13830 1727204089.56142: variable 'ansible_search_path' from source: unknown 13830 1727204089.56148: variable 'ansible_search_path' from source: unknown 13830 1727204089.56191: calling self._execute() 13830 1727204089.56287: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.56291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.56301: variable 'omit' from source: magic vars 13830 1727204089.56665: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.56678: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.56816: variable 'profile_stat' from source: set_fact 13830 1727204089.56828: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204089.56834: when evaluation is False, skipping this task 13830 1727204089.56837: _execute() done 13830 1727204089.56840: dumping result to json 13830 1727204089.56843: done dumping result, returning 13830 1727204089.56846: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affcd87-79f5-1659-6b02-00000000055f] 13830 1727204089.56852: sending task result for task 0affcd87-79f5-1659-6b02-00000000055f 13830 1727204089.56951: done sending task result for task 0affcd87-79f5-1659-6b02-00000000055f 13830 1727204089.56954: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204089.57009: no more pending results, returning what we have 13830 1727204089.57013: results queue empty 13830 1727204089.57014: checking for any_errors_fatal 13830 1727204089.57022: done checking for any_errors_fatal 13830 1727204089.57023: checking for max_fail_percentage 13830 1727204089.57025: done checking for max_fail_percentage 13830 1727204089.57026: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.57027: done checking to see if all hosts have failed 13830 1727204089.57027: getting the remaining hosts for this loop 13830 1727204089.57029: done getting the remaining hosts for this loop 13830 1727204089.57033: getting the next task for host managed-node3 13830 1727204089.57042: done getting next task for host managed-node3 13830 1727204089.57045: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13830 1727204089.57051: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204089.57055: getting variables 13830 1727204089.57056: in VariableManager get_vars() 13830 1727204089.57093: Calling all_inventory to load vars for managed-node3 13830 1727204089.57097: Calling groups_inventory to load vars for managed-node3 13830 1727204089.57101: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.57114: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.57117: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.57119: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.59584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.61215: done with get_vars() 13830 1727204089.61244: done getting variables 13830 1727204089.61313: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204089.61439: variable 'profile' from source: include params 13830 1727204089.61443: variable 'bond_port_profile' from source: include params 13830 1727204089.61605: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.063) 0:00:22.694 ***** 13830 1727204089.61641: entering _queue_task() for managed-node3/command 13830 1727204089.62170: worker is 1 (out of 1 available) 13830 1727204089.62182: exiting _queue_task() for managed-node3/command 13830 1727204089.62193: done queuing things up, now waiting for results queue to drain 13830 1727204089.62195: waiting for pending results... 
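The templated task name (`Get the fingerprint comment in ifcfg-{{ profile }}`) together with the variable lookups logged just above (`profile` and `bond_port_profile` both resolved from include params) indicate that get_profile_stat.yml is included per bond port profile, with `profile` mapped from `bond_port_profile`. A hedged sketch of that wiring, since the including playbook is not part of this excerpt:

```yaml
- name: Gather profile facts for the bond port profile
  ansible.builtin.include_tasks: get_profile_stat.yml
  vars:
    profile: "{{ bond_port_profile }}"   # 'bond0.0' in the run above; how bond_port_profile itself is
                                         # supplied (for example a loop over port profiles) is not shown here
```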
13830 1727204089.62945: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 13830 1727204089.63156: in run() - task 0affcd87-79f5-1659-6b02-000000000560 13830 1727204089.63172: variable 'ansible_search_path' from source: unknown 13830 1727204089.63176: variable 'ansible_search_path' from source: unknown 13830 1727204089.63219: calling self._execute() 13830 1727204089.63319: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.63324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.63335: variable 'omit' from source: magic vars 13830 1727204089.63713: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.63733: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.63853: variable 'profile_stat' from source: set_fact 13830 1727204089.63863: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204089.63869: when evaluation is False, skipping this task 13830 1727204089.63872: _execute() done 13830 1727204089.63875: dumping result to json 13830 1727204089.63877: done dumping result, returning 13830 1727204089.63885: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affcd87-79f5-1659-6b02-000000000560] 13830 1727204089.63888: sending task result for task 0affcd87-79f5-1659-6b02-000000000560 13830 1727204089.64007: done sending task result for task 0affcd87-79f5-1659-6b02-000000000560 13830 1727204089.64010: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204089.64065: no more pending results, returning what we have 13830 1727204089.64071: results queue empty 13830 1727204089.64072: checking for any_errors_fatal 13830 1727204089.64081: done checking for any_errors_fatal 13830 1727204089.64082: checking for max_fail_percentage 13830 1727204089.64084: done checking for max_fail_percentage 13830 1727204089.64085: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.64085: done checking to see if all hosts have failed 13830 1727204089.64086: getting the remaining hosts for this loop 13830 1727204089.64088: done getting the remaining hosts for this loop 13830 1727204089.64092: getting the next task for host managed-node3 13830 1727204089.64100: done getting next task for host managed-node3 13830 1727204089.64102: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13830 1727204089.64108: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204089.64112: getting variables 13830 1727204089.64114: in VariableManager get_vars() 13830 1727204089.64149: Calling all_inventory to load vars for managed-node3 13830 1727204089.64152: Calling groups_inventory to load vars for managed-node3 13830 1727204089.64155: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.64168: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.64171: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.64174: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.65034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.67528: done with get_vars() 13830 1727204089.67590: done getting variables 13830 1727204089.67658: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204089.67759: variable 'profile' from source: include params 13830 1727204089.67763: variable 'bond_port_profile' from source: include params 13830 1727204089.67818: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.062) 0:00:22.756 ***** 13830 1727204089.67846: entering _queue_task() for managed-node3/set_fact 13830 1727204089.68082: worker is 1 (out of 1 available) 13830 1727204089.68094: exiting _queue_task() for managed-node3/set_fact 13830 1727204089.68108: done queuing things up, now waiting for results queue to drain 13830 1727204089.68109: waiting for pending results... 
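The task queued here (get_profile_stat.yml:69) is a set_fact action under the same guard, and in this run it is likewise skipped because profile_stat.stat.exists is False. A hedged sketch follows, assuming it sets the lsr_net_profile_fingerprint flag that the later assert checks; the exact expression is not visible in the log, and fingerprint_comment is the hypothetical register name introduced in the previous sketch.

    # Hedged sketch of the task at get_profile_stat.yml:69; only the set_fact
    # module, the flag name and the when guard are taken from the log.
    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_fingerprint: "{{ fingerprint_comment.rc == 0 }}"  # expression assumed
      when: profile_stat.stat.exists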
13830 1727204089.68290: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 13830 1727204089.68380: in run() - task 0affcd87-79f5-1659-6b02-000000000561 13830 1727204089.68394: variable 'ansible_search_path' from source: unknown 13830 1727204089.68397: variable 'ansible_search_path' from source: unknown 13830 1727204089.68428: calling self._execute() 13830 1727204089.68506: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.68510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.68518: variable 'omit' from source: magic vars 13830 1727204089.68789: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.68801: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.68890: variable 'profile_stat' from source: set_fact 13830 1727204089.68899: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204089.68902: when evaluation is False, skipping this task 13830 1727204089.68905: _execute() done 13830 1727204089.68907: dumping result to json 13830 1727204089.68909: done dumping result, returning 13830 1727204089.68917: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affcd87-79f5-1659-6b02-000000000561] 13830 1727204089.68926: sending task result for task 0affcd87-79f5-1659-6b02-000000000561 13830 1727204089.69008: done sending task result for task 0affcd87-79f5-1659-6b02-000000000561 13830 1727204089.69012: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204089.69066: no more pending results, returning what we have 13830 1727204089.69070: results queue empty 13830 1727204089.69071: checking for any_errors_fatal 13830 1727204089.69078: done checking for any_errors_fatal 13830 1727204089.69079: checking for max_fail_percentage 13830 1727204089.69081: done checking for max_fail_percentage 13830 1727204089.69082: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.69083: done checking to see if all hosts have failed 13830 1727204089.69083: getting the remaining hosts for this loop 13830 1727204089.69085: done getting the remaining hosts for this loop 13830 1727204089.69089: getting the next task for host managed-node3 13830 1727204089.69097: done getting next task for host managed-node3 13830 1727204089.69100: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13830 1727204089.69105: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204089.69110: getting variables 13830 1727204089.69111: in VariableManager get_vars() 13830 1727204089.69152: Calling all_inventory to load vars for managed-node3 13830 1727204089.69154: Calling groups_inventory to load vars for managed-node3 13830 1727204089.69157: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.69169: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.69171: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.69174: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.72083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.73387: done with get_vars() 13830 1727204089.73412: done getting variables 13830 1727204089.73474: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204089.73595: variable 'profile' from source: include params 13830 1727204089.73602: variable 'bond_port_profile' from source: include params 13830 1727204089.73668: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.058) 0:00:22.815 ***** 13830 1727204089.73701: entering _queue_task() for managed-node3/assert 13830 1727204089.74034: worker is 1 (out of 1 available) 13830 1727204089.74048: exiting _queue_task() for managed-node3/assert 13830 1727204089.74060: done queuing things up, now waiting for results queue to drain 13830 1727204089.74062: waiting for pending results... 
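The assert task queued here (assert_profile_present.yml:5) is the simplest of the three checks in that file: the trace below shows the assert action evaluating the lsr_net_profile_exists flag, which was registered earlier via set_fact, and reporting "All assertions passed". A sketch consistent with that; only the assert module, the task name and the flag name are taken from the log, and the fail message wording is an assumption.

    # Sketch of the task at assert_profile_present.yml:5.
    - name: "Assert that the profile is present - '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_exists
        fail_msg: "Profile '{{ profile }}' does not exist"  # message assumed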
13830 1727204089.74420: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.0' 13830 1727204089.74660: in run() - task 0affcd87-79f5-1659-6b02-0000000004e1 13830 1727204089.74671: variable 'ansible_search_path' from source: unknown 13830 1727204089.74675: variable 'ansible_search_path' from source: unknown 13830 1727204089.74678: calling self._execute() 13830 1727204089.74790: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.74795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.74921: variable 'omit' from source: magic vars 13830 1727204089.75623: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.75636: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.75640: variable 'omit' from source: magic vars 13830 1727204089.75742: variable 'omit' from source: magic vars 13830 1727204089.75866: variable 'profile' from source: include params 13830 1727204089.75870: variable 'bond_port_profile' from source: include params 13830 1727204089.75987: variable 'bond_port_profile' from source: include params 13830 1727204089.76005: variable 'omit' from source: magic vars 13830 1727204089.76083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204089.76112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204089.76129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204089.76150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.76158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.76186: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204089.76190: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.76192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.76291: Set connection var ansible_connection to ssh 13830 1727204089.76300: Set connection var ansible_timeout to 10 13830 1727204089.76305: Set connection var ansible_shell_executable to /bin/sh 13830 1727204089.76308: Set connection var ansible_shell_type to sh 13830 1727204089.76313: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204089.76321: Set connection var ansible_pipelining to False 13830 1727204089.76340: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.76343: variable 'ansible_connection' from source: unknown 13830 1727204089.76346: variable 'ansible_module_compression' from source: unknown 13830 1727204089.76348: variable 'ansible_shell_type' from source: unknown 13830 1727204089.76352: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.76354: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.76356: variable 'ansible_pipelining' from source: unknown 13830 1727204089.76359: variable 'ansible_timeout' from source: unknown 13830 1727204089.76361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.76472: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204089.76483: variable 'omit' from source: magic vars 13830 1727204089.76491: starting attempt loop 13830 1727204089.76494: running the handler 13830 1727204089.76570: variable 'lsr_net_profile_exists' from source: set_fact 13830 1727204089.76573: Evaluated conditional (lsr_net_profile_exists): True 13830 1727204089.76580: handler run complete 13830 1727204089.76593: attempt loop complete, returning result 13830 1727204089.76597: _execute() done 13830 1727204089.76599: dumping result to json 13830 1727204089.76602: done dumping result, returning 13830 1727204089.76607: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.0' [0affcd87-79f5-1659-6b02-0000000004e1] 13830 1727204089.76611: sending task result for task 0affcd87-79f5-1659-6b02-0000000004e1 13830 1727204089.76701: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004e1 13830 1727204089.76704: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204089.76754: no more pending results, returning what we have 13830 1727204089.76758: results queue empty 13830 1727204089.76759: checking for any_errors_fatal 13830 1727204089.76765: done checking for any_errors_fatal 13830 1727204089.76766: checking for max_fail_percentage 13830 1727204089.76768: done checking for max_fail_percentage 13830 1727204089.76769: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.76770: done checking to see if all hosts have failed 13830 1727204089.76770: getting the remaining hosts for this loop 13830 1727204089.76772: done getting the remaining hosts for this loop 13830 1727204089.76776: getting the next task for host managed-node3 13830 1727204089.76783: done getting next task for host managed-node3 13830 1727204089.76785: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13830 1727204089.76789: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204089.76794: getting variables 13830 1727204089.76795: in VariableManager get_vars() 13830 1727204089.76833: Calling all_inventory to load vars for managed-node3 13830 1727204089.76836: Calling groups_inventory to load vars for managed-node3 13830 1727204089.76840: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.76849: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.76851: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.76854: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.77770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.79307: done with get_vars() 13830 1727204089.79325: done getting variables 13830 1727204089.79372: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204089.79465: variable 'profile' from source: include params 13830 1727204089.79468: variable 'bond_port_profile' from source: include params 13830 1727204089.79509: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.058) 0:00:22.873 ***** 13830 1727204089.79536: entering _queue_task() for managed-node3/assert 13830 1727204089.79773: worker is 1 (out of 1 available) 13830 1727204089.79785: exiting _queue_task() for managed-node3/assert 13830 1727204089.79797: done queuing things up, now waiting for results queue to drain 13830 1727204089.79799: waiting for pending results... 
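This assert (assert_profile_present.yml:10) and the fingerprint assert that follows it (assert_profile_present.yml:15) repeat the same pattern, each checking one flag set by the get_profile_stat.yml include. A compact sketch of both; only the assert module, the task names and the flag names appear in the log, everything else is omitted rather than guessed.

    # Sketches of the tasks at assert_profile_present.yml:10 and :15.
    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that:
          - lsr_net_profile_fingerprint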
13830 1727204089.79983: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 13830 1727204089.80062: in run() - task 0affcd87-79f5-1659-6b02-0000000004e2 13830 1727204089.80077: variable 'ansible_search_path' from source: unknown 13830 1727204089.80080: variable 'ansible_search_path' from source: unknown 13830 1727204089.80111: calling self._execute() 13830 1727204089.80187: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.80191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.80199: variable 'omit' from source: magic vars 13830 1727204089.80476: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.80487: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.80493: variable 'omit' from source: magic vars 13830 1727204089.80532: variable 'omit' from source: magic vars 13830 1727204089.80605: variable 'profile' from source: include params 13830 1727204089.80608: variable 'bond_port_profile' from source: include params 13830 1727204089.80657: variable 'bond_port_profile' from source: include params 13830 1727204089.80674: variable 'omit' from source: magic vars 13830 1727204089.80708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204089.80737: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204089.80759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204089.80770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.80780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.80803: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204089.80806: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.80809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.80885: Set connection var ansible_connection to ssh 13830 1727204089.80894: Set connection var ansible_timeout to 10 13830 1727204089.80899: Set connection var ansible_shell_executable to /bin/sh 13830 1727204089.80902: Set connection var ansible_shell_type to sh 13830 1727204089.80906: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204089.80914: Set connection var ansible_pipelining to False 13830 1727204089.80930: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.80935: variable 'ansible_connection' from source: unknown 13830 1727204089.80938: variable 'ansible_module_compression' from source: unknown 13830 1727204089.80940: variable 'ansible_shell_type' from source: unknown 13830 1727204089.80943: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.80945: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.80949: variable 'ansible_pipelining' from source: unknown 13830 1727204089.80952: variable 'ansible_timeout' from source: unknown 13830 1727204089.80956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.81057: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204089.81067: variable 'omit' from source: magic vars 13830 1727204089.81072: starting attempt loop 13830 1727204089.81076: running the handler 13830 1727204089.81154: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13830 1727204089.81158: Evaluated conditional (lsr_net_profile_ansible_managed): True 13830 1727204089.81165: handler run complete 13830 1727204089.81176: attempt loop complete, returning result 13830 1727204089.81179: _execute() done 13830 1727204089.81181: dumping result to json 13830 1727204089.81183: done dumping result, returning 13830 1727204089.81191: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affcd87-79f5-1659-6b02-0000000004e2] 13830 1727204089.81195: sending task result for task 0affcd87-79f5-1659-6b02-0000000004e2 13830 1727204089.81278: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004e2 13830 1727204089.81281: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204089.81366: no more pending results, returning what we have 13830 1727204089.81369: results queue empty 13830 1727204089.81370: checking for any_errors_fatal 13830 1727204089.81376: done checking for any_errors_fatal 13830 1727204089.81377: checking for max_fail_percentage 13830 1727204089.81378: done checking for max_fail_percentage 13830 1727204089.81379: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.81380: done checking to see if all hosts have failed 13830 1727204089.81381: getting the remaining hosts for this loop 13830 1727204089.81382: done getting the remaining hosts for this loop 13830 1727204089.81386: getting the next task for host managed-node3 13830 1727204089.81392: done getting next task for host managed-node3 13830 1727204089.81394: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13830 1727204089.81399: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204089.81407: getting variables 13830 1727204089.81408: in VariableManager get_vars() 13830 1727204089.81438: Calling all_inventory to load vars for managed-node3 13830 1727204089.81441: Calling groups_inventory to load vars for managed-node3 13830 1727204089.81444: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.81452: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.81454: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.81457: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.82237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.83169: done with get_vars() 13830 1727204089.83187: done getting variables 13830 1727204089.83231: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204089.83319: variable 'profile' from source: include params 13830 1727204089.83322: variable 'bond_port_profile' from source: include params 13830 1727204089.83363: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.038) 0:00:22.911 ***** 13830 1727204089.83389: entering _queue_task() for managed-node3/assert 13830 1727204089.83611: worker is 1 (out of 1 available) 13830 1727204089.83626: exiting _queue_task() for managed-node3/assert 13830 1727204089.83637: done queuing things up, now waiting for results queue to drain 13830 1727204089.83639: waiting for pending results... 
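Throughout these tasks the log resolves the templated names in two steps, 'profile' from include params and then 'bond_port_profile' from include params, ending at the literal 'bond0.0'. That is the pattern produced when assert_profile_present.yml is included with the generic profile variable mapped onto the port profile name. A hedged sketch of such an include; only the included file name, the two variable names and the value 'bond0.0' appear in the log, and how bond_port_profile itself is populated (for example a loop over the bond ports) is not visible here.

    # Hedged sketch, not the actual calling playbook.
    - name: Assert that the bond port profile is present   # task name assumed
      include_tasks: assert_profile_present.yml
      vars:
        profile: "{{ bond_port_profile }}"   # grounded: 'profile' resolves through 'bond_port_profile'
      # bond_port_profile is expected to hold a port name such as 'bond0.0'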
13830 1727204089.83821: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.0 13830 1727204089.83910: in run() - task 0affcd87-79f5-1659-6b02-0000000004e3 13830 1727204089.83923: variable 'ansible_search_path' from source: unknown 13830 1727204089.83927: variable 'ansible_search_path' from source: unknown 13830 1727204089.83959: calling self._execute() 13830 1727204089.84032: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.84039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.84048: variable 'omit' from source: magic vars 13830 1727204089.84309: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.84319: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.84325: variable 'omit' from source: magic vars 13830 1727204089.84367: variable 'omit' from source: magic vars 13830 1727204089.84436: variable 'profile' from source: include params 13830 1727204089.84440: variable 'bond_port_profile' from source: include params 13830 1727204089.84489: variable 'bond_port_profile' from source: include params 13830 1727204089.84503: variable 'omit' from source: magic vars 13830 1727204089.84540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204089.84570: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204089.84590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204089.84601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.84611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.84638: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204089.84641: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.84644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.84714: Set connection var ansible_connection to ssh 13830 1727204089.84723: Set connection var ansible_timeout to 10 13830 1727204089.84727: Set connection var ansible_shell_executable to /bin/sh 13830 1727204089.84729: Set connection var ansible_shell_type to sh 13830 1727204089.84738: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204089.84746: Set connection var ansible_pipelining to False 13830 1727204089.84762: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.84767: variable 'ansible_connection' from source: unknown 13830 1727204089.84769: variable 'ansible_module_compression' from source: unknown 13830 1727204089.84772: variable 'ansible_shell_type' from source: unknown 13830 1727204089.84774: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.84776: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.84781: variable 'ansible_pipelining' from source: unknown 13830 1727204089.84784: variable 'ansible_timeout' from source: unknown 13830 1727204089.84786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.84892: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204089.84900: variable 'omit' from source: magic vars 13830 1727204089.84905: starting attempt loop 13830 1727204089.84908: running the handler 13830 1727204089.84985: variable 'lsr_net_profile_fingerprint' from source: set_fact 13830 1727204089.84989: Evaluated conditional (lsr_net_profile_fingerprint): True 13830 1727204089.84999: handler run complete 13830 1727204089.85009: attempt loop complete, returning result 13830 1727204089.85011: _execute() done 13830 1727204089.85014: dumping result to json 13830 1727204089.85016: done dumping result, returning 13830 1727204089.85023: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.0 [0affcd87-79f5-1659-6b02-0000000004e3] 13830 1727204089.85028: sending task result for task 0affcd87-79f5-1659-6b02-0000000004e3 13830 1727204089.85114: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004e3 13830 1727204089.85117: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204089.85169: no more pending results, returning what we have 13830 1727204089.85173: results queue empty 13830 1727204089.85174: checking for any_errors_fatal 13830 1727204089.85181: done checking for any_errors_fatal 13830 1727204089.85182: checking for max_fail_percentage 13830 1727204089.85184: done checking for max_fail_percentage 13830 1727204089.85185: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.85185: done checking to see if all hosts have failed 13830 1727204089.85186: getting the remaining hosts for this loop 13830 1727204089.85188: done getting the remaining hosts for this loop 13830 1727204089.85192: getting the next task for host managed-node3 13830 1727204089.85203: done getting next task for host managed-node3 13830 1727204089.85205: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13830 1727204089.85215: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204089.85219: getting variables 13830 1727204089.85221: in VariableManager get_vars() 13830 1727204089.85256: Calling all_inventory to load vars for managed-node3 13830 1727204089.85258: Calling groups_inventory to load vars for managed-node3 13830 1727204089.85261: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.85273: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.85275: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.85278: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.86217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.87130: done with get_vars() 13830 1727204089.87147: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.038) 0:00:22.950 ***** 13830 1727204089.87225: entering _queue_task() for managed-node3/include_tasks 13830 1727204089.87451: worker is 1 (out of 1 available) 13830 1727204089.87466: exiting _queue_task() for managed-node3/include_tasks 13830 1727204089.87477: done queuing things up, now waiting for results queue to drain 13830 1727204089.87479: waiting for pending results... 13830 1727204089.87654: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 13830 1727204089.87734: in run() - task 0affcd87-79f5-1659-6b02-0000000004e7 13830 1727204089.87745: variable 'ansible_search_path' from source: unknown 13830 1727204089.87748: variable 'ansible_search_path' from source: unknown 13830 1727204089.87778: calling self._execute() 13830 1727204089.87855: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.87860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.87871: variable 'omit' from source: magic vars 13830 1727204089.88128: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.88141: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.88144: _execute() done 13830 1727204089.88147: dumping result to json 13830 1727204089.88152: done dumping result, returning 13830 1727204089.88154: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-1659-6b02-0000000004e7] 13830 1727204089.88162: sending task result for task 0affcd87-79f5-1659-6b02-0000000004e7 13830 1727204089.88248: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004e7 13830 1727204089.88251: WORKER PROCESS EXITING 13830 1727204089.88293: no more pending results, returning what we have 13830 1727204089.88298: in VariableManager get_vars() 13830 1727204089.88338: Calling all_inventory to load vars for managed-node3 13830 1727204089.88341: Calling groups_inventory to load vars for managed-node3 13830 1727204089.88344: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.88355: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.88357: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.88364: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.89155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 13830 1727204089.90068: done with get_vars() 13830 1727204089.90084: variable 'ansible_search_path' from source: unknown 13830 1727204089.90085: variable 'ansible_search_path' from source: unknown 13830 1727204089.90112: we have included files to process 13830 1727204089.90113: generating all_blocks data 13830 1727204089.90115: done generating all_blocks data 13830 1727204089.90118: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204089.90118: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204089.90120: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13830 1727204089.90738: done processing included file 13830 1727204089.90739: iterating over new_blocks loaded from include file 13830 1727204089.90741: in VariableManager get_vars() 13830 1727204089.90753: done with get_vars() 13830 1727204089.90754: filtering new block on tags 13830 1727204089.90803: done filtering new block on tags 13830 1727204089.90805: in VariableManager get_vars() 13830 1727204089.90815: done with get_vars() 13830 1727204089.90816: filtering new block on tags 13830 1727204089.90857: done filtering new block on tags 13830 1727204089.90859: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 13830 1727204089.90863: extending task lists for all hosts with included blocks 13830 1727204089.91122: done extending task lists 13830 1727204089.91124: done processing included files 13830 1727204089.91124: results queue empty 13830 1727204089.91124: checking for any_errors_fatal 13830 1727204089.91127: done checking for any_errors_fatal 13830 1727204089.91127: checking for max_fail_percentage 13830 1727204089.91128: done checking for max_fail_percentage 13830 1727204089.91128: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.91129: done checking to see if all hosts have failed 13830 1727204089.91129: getting the remaining hosts for this loop 13830 1727204089.91132: done getting the remaining hosts for this loop 13830 1727204089.91134: getting the next task for host managed-node3 13830 1727204089.91137: done getting next task for host managed-node3 13830 1727204089.91138: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13830 1727204089.91141: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204089.91142: getting variables 13830 1727204089.91143: in VariableManager get_vars() 13830 1727204089.91149: Calling all_inventory to load vars for managed-node3 13830 1727204089.91150: Calling groups_inventory to load vars for managed-node3 13830 1727204089.91152: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.91156: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.91157: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.91159: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.91870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.92772: done with get_vars() 13830 1727204089.92794: done getting variables 13830 1727204089.92833: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.056) 0:00:23.006 ***** 13830 1727204089.92863: entering _queue_task() for managed-node3/set_fact 13830 1727204089.93185: worker is 1 (out of 1 available) 13830 1727204089.93198: exiting _queue_task() for managed-node3/set_fact 13830 1727204089.93211: done queuing things up, now waiting for results queue to drain 13830 1727204089.93213: waiting for pending results... 
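The include just processed pulls get_profile_stat.yml back in, and its first two tasks run next: the set_fact at get_profile_stat.yml:3 that resets the three flags (the exact facts appear verbatim in the ok: result further down) and the stat at get_profile_stat.yml:9 that produces the profile_stat result used by the later when guards. A sketch of those two tasks; the flag names and initial values and the stat module are grounded in the log, while the ifcfg path is an assumption and the register name is inferred from the profile_stat variable checked later.

    # get_profile_stat.yml:3 -- flags and values taken from the ansible_facts in the log.
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false

    # get_profile_stat.yml:9 -- the stat module is transferred as ansible.modules.stat
    # in the log; the path is assumed, the register name inferred.
    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # path assumed
      register: profile_stat

Resetting all three flags to false before the stat runs presumably ensures that, when the profile file is absent, the guarded tasks are simply skipped and the later asserts fail on a clearly named flag rather than on an undefined variable.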
13830 1727204089.93499: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13830 1727204089.93618: in run() - task 0affcd87-79f5-1659-6b02-0000000005b4 13830 1727204089.93635: variable 'ansible_search_path' from source: unknown 13830 1727204089.93638: variable 'ansible_search_path' from source: unknown 13830 1727204089.93674: calling self._execute() 13830 1727204089.93760: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.93772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.93784: variable 'omit' from source: magic vars 13830 1727204089.94151: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.94167: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.94172: variable 'omit' from source: magic vars 13830 1727204089.94242: variable 'omit' from source: magic vars 13830 1727204089.94277: variable 'omit' from source: magic vars 13830 1727204089.94317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204089.94366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204089.94379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204089.94392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.94402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.94429: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204089.94434: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.94437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.94509: Set connection var ansible_connection to ssh 13830 1727204089.94520: Set connection var ansible_timeout to 10 13830 1727204089.94523: Set connection var ansible_shell_executable to /bin/sh 13830 1727204089.94529: Set connection var ansible_shell_type to sh 13830 1727204089.94534: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204089.94541: Set connection var ansible_pipelining to False 13830 1727204089.94559: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.94562: variable 'ansible_connection' from source: unknown 13830 1727204089.94567: variable 'ansible_module_compression' from source: unknown 13830 1727204089.94569: variable 'ansible_shell_type' from source: unknown 13830 1727204089.94572: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.94574: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.94576: variable 'ansible_pipelining' from source: unknown 13830 1727204089.94578: variable 'ansible_timeout' from source: unknown 13830 1727204089.94583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.94685: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204089.94694: variable 
'omit' from source: magic vars 13830 1727204089.94699: starting attempt loop 13830 1727204089.94702: running the handler 13830 1727204089.94713: handler run complete 13830 1727204089.94723: attempt loop complete, returning result 13830 1727204089.94726: _execute() done 13830 1727204089.94728: dumping result to json 13830 1727204089.94734: done dumping result, returning 13830 1727204089.94737: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-1659-6b02-0000000005b4] 13830 1727204089.94745: sending task result for task 0affcd87-79f5-1659-6b02-0000000005b4 13830 1727204089.94822: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005b4 13830 1727204089.94825: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13830 1727204089.94913: no more pending results, returning what we have 13830 1727204089.94916: results queue empty 13830 1727204089.94917: checking for any_errors_fatal 13830 1727204089.94919: done checking for any_errors_fatal 13830 1727204089.94920: checking for max_fail_percentage 13830 1727204089.94921: done checking for max_fail_percentage 13830 1727204089.94922: checking to see if all hosts have failed and the running result is not ok 13830 1727204089.94923: done checking to see if all hosts have failed 13830 1727204089.94923: getting the remaining hosts for this loop 13830 1727204089.94925: done getting the remaining hosts for this loop 13830 1727204089.94929: getting the next task for host managed-node3 13830 1727204089.94939: done getting next task for host managed-node3 13830 1727204089.94942: ^ task is: TASK: Stat profile file 13830 1727204089.94948: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204089.94951: getting variables 13830 1727204089.94952: in VariableManager get_vars() 13830 1727204089.94987: Calling all_inventory to load vars for managed-node3 13830 1727204089.94990: Calling groups_inventory to load vars for managed-node3 13830 1727204089.94993: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204089.95002: Calling all_plugins_play to load vars for managed-node3 13830 1727204089.95004: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204089.95006: Calling groups_plugins_play to load vars for managed-node3 13830 1727204089.95782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204089.97456: done with get_vars() 13830 1727204089.97480: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.047) 0:00:23.053 ***** 13830 1727204089.97585: entering _queue_task() for managed-node3/stat 13830 1727204089.97890: worker is 1 (out of 1 available) 13830 1727204089.97903: exiting _queue_task() for managed-node3/stat 13830 1727204089.97915: done queuing things up, now waiting for results queue to drain 13830 1727204089.97917: waiting for pending results... 13830 1727204089.98203: running TaskExecutor() for managed-node3/TASK: Stat profile file 13830 1727204089.98336: in run() - task 0affcd87-79f5-1659-6b02-0000000005b5 13830 1727204089.98358: variable 'ansible_search_path' from source: unknown 13830 1727204089.98370: variable 'ansible_search_path' from source: unknown 13830 1727204089.98407: calling self._execute() 13830 1727204089.98510: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.98522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.98538: variable 'omit' from source: magic vars 13830 1727204089.98914: variable 'ansible_distribution_major_version' from source: facts 13830 1727204089.98934: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204089.98945: variable 'omit' from source: magic vars 13830 1727204089.99016: variable 'omit' from source: magic vars 13830 1727204089.99120: variable 'profile' from source: include params 13830 1727204089.99134: variable 'bond_port_profile' from source: include params 13830 1727204089.99202: variable 'bond_port_profile' from source: include params 13830 1727204089.99226: variable 'omit' from source: magic vars 13830 1727204089.99282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204089.99320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204089.99352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204089.99375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.99391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204089.99425: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204089.99436: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 
1727204089.99444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.99547: Set connection var ansible_connection to ssh 13830 1727204089.99568: Set connection var ansible_timeout to 10 13830 1727204089.99579: Set connection var ansible_shell_executable to /bin/sh 13830 1727204089.99585: Set connection var ansible_shell_type to sh 13830 1727204089.99593: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204089.99607: Set connection var ansible_pipelining to False 13830 1727204089.99635: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.99642: variable 'ansible_connection' from source: unknown 13830 1727204089.99649: variable 'ansible_module_compression' from source: unknown 13830 1727204089.99654: variable 'ansible_shell_type' from source: unknown 13830 1727204089.99660: variable 'ansible_shell_executable' from source: unknown 13830 1727204089.99671: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204089.99678: variable 'ansible_pipelining' from source: unknown 13830 1727204089.99684: variable 'ansible_timeout' from source: unknown 13830 1727204089.99691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204089.99915: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204089.99935: variable 'omit' from source: magic vars 13830 1727204089.99947: starting attempt loop 13830 1727204089.99953: running the handler 13830 1727204089.99972: _low_level_execute_command(): starting 13830 1727204089.99983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204090.00770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204090.00788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.00805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.00826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.00884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.00898: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204090.00913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.00936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204090.00949: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204090.00961: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204090.00977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.00996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.01013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.01026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.01041: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204090.01057: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.01141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.01169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.01188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.01275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.02957: stdout chunk (state=3): >>>/root <<< 13830 1727204090.03153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.03156: stdout chunk (state=3): >>><<< 13830 1727204090.03159: stderr chunk (state=3): >>><<< 13830 1727204090.03271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204090.03275: _low_level_execute_command(): starting 13830 1727204090.03278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044 `" && echo ansible-tmp-1727204090.0318313-15384-252251448459044="` echo /root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044 `" ) && sleep 0' 13830 1727204090.03896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204090.03909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.03923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.03946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.03994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.04005: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204090.04020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.04039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204090.04055: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204090.04070: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 
1727204090.04084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.04097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.04111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.04122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.04135: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204090.04148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.04229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.04253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.04274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.04350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.06327: stdout chunk (state=3): >>>ansible-tmp-1727204090.0318313-15384-252251448459044=/root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044 <<< 13830 1727204090.06486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.06569: stderr chunk (state=3): >>><<< 13830 1727204090.06572: stdout chunk (state=3): >>><<< 13830 1727204090.06873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204090.0318313-15384-252251448459044=/root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204090.06877: variable 'ansible_module_compression' from source: unknown 13830 1727204090.06879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13830 1727204090.06881: variable 'ansible_facts' from source: unknown 13830 1727204090.06883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044/AnsiballZ_stat.py 13830 1727204090.07014: Sending initial data 13830 1727204090.07017: Sent initial data (153 bytes) 13830 1727204090.08048: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204090.08063: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13830 1727204090.08084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.08101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.08148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.08159: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204090.08176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.08197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204090.08208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204090.08217: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204090.08228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.08244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.08258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.08271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.08282: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204090.08304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.08382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.08412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.08429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.08513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.10328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204090.10346: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204090.10391: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpvnhle7_7 /root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044/AnsiballZ_stat.py <<< 13830 1727204090.10423: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204090.11761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.12043: stderr chunk (state=3): >>><<< 13830 1727204090.12046: stdout chunk (state=3): >>><<< 13830 1727204090.12048: done transferring module to remote 13830 1727204090.12050: _low_level_execute_command(): starting 13830 1727204090.12057: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044/ /root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044/AnsiballZ_stat.py && sleep 0' 13830 1727204090.12862: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204090.12881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.12896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.12921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.12968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.13043: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204090.13059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.13085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204090.13098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204090.13110: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204090.13123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.13146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.13167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.13181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.13194: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204090.13207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.13403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.13428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.13450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.13530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.15342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.15346: stdout chunk (state=3): >>><<< 13830 1727204090.15350: stderr chunk (state=3): >>><<< 13830 1727204090.15390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204090.15394: _low_level_execute_command(): starting 13830 1727204090.15396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044/AnsiballZ_stat.py && sleep 0' 13830 1727204090.16004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204090.16008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.16026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.16031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.16074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.16079: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204090.16103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.16106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204090.16108: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204090.16114: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204090.16122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.16132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.16147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.16154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.16160: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204090.16173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.16252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.16273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.16277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.16359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.29377: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13830 1727204090.30328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204090.30387: stderr chunk (state=3): >>><<< 13830 1727204090.30391: stdout chunk (state=3): >>><<< 13830 1727204090.30405: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204090.30432: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204090.30442: _low_level_execute_command(): starting 13830 1727204090.30447: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204090.0318313-15384-252251448459044/ > /dev/null 2>&1 && sleep 0' 13830 1727204090.30903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.30907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.30961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204090.30967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 
1727204090.30970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204090.30972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.31023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.31027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.31041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.31082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.32830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.32884: stderr chunk (state=3): >>><<< 13830 1727204090.32887: stdout chunk (state=3): >>><<< 13830 1727204090.32901: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204090.32907: handler run complete 13830 1727204090.32924: attempt loop complete, returning result 13830 1727204090.32927: _execute() done 13830 1727204090.32929: dumping result to json 13830 1727204090.32931: done dumping result, returning 13830 1727204090.32943: done running TaskExecutor() for managed-node3/TASK: Stat profile file [0affcd87-79f5-1659-6b02-0000000005b5] 13830 1727204090.32951: sending task result for task 0affcd87-79f5-1659-6b02-0000000005b5 13830 1727204090.33042: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005b5 13830 1727204090.33045: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 13830 1727204090.33110: no more pending results, returning what we have 13830 1727204090.33114: results queue empty 13830 1727204090.33115: checking for any_errors_fatal 13830 1727204090.33123: done checking for any_errors_fatal 13830 1727204090.33123: checking for max_fail_percentage 13830 1727204090.33125: done checking for max_fail_percentage 13830 1727204090.33126: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.33126: done checking to see if all hosts have failed 13830 1727204090.33127: getting the remaining hosts for this loop 13830 1727204090.33129: done getting the 
remaining hosts for this loop 13830 1727204090.33136: getting the next task for host managed-node3 13830 1727204090.33143: done getting next task for host managed-node3 13830 1727204090.33146: ^ task is: TASK: Set NM profile exist flag based on the profile files 13830 1727204090.33151: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204090.33155: getting variables 13830 1727204090.33156: in VariableManager get_vars() 13830 1727204090.33193: Calling all_inventory to load vars for managed-node3 13830 1727204090.33196: Calling groups_inventory to load vars for managed-node3 13830 1727204090.33200: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.33211: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.33214: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.33216: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.34056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204090.34990: done with get_vars() 13830 1727204090.35008: done getting variables 13830 1727204090.35054: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.375) 0:00:23.428 ***** 13830 1727204090.35087: entering _queue_task() for managed-node3/set_fact 13830 1727204090.35316: worker is 1 (out of 1 available) 13830 1727204090.35330: exiting _queue_task() for managed-node3/set_fact 13830 1727204090.35343: done queuing things up, now waiting for results queue to drain 13830 1727204090.35345: waiting for pending results... 
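For readability, the task that just completed ("Stat profile file") and the one just queued at get_profile_stat.yml:17 can be sketched from the module arguments and conditionals recorded above and below. This is an inference, not the verbatim contents of get_profile_stat.yml: the task names and the register name profile_stat come from the log, while the templated path and the fact name set by the second task are assumptions.

- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # log shows only the rendered value ifcfg-bond0.1; the template is assumed
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    profile_exists_flag: true  # placeholder fact name; the real name is not visible in this log
  when: profile_stat.stat.exists

In this run profile_stat.stat.exists is false (no initscripts-style ifcfg file for bond0.1), so the set_fact task is skipped, as the next block of log output shows.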
13830 1727204090.35520: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 13830 1727204090.35610: in run() - task 0affcd87-79f5-1659-6b02-0000000005b6 13830 1727204090.35625: variable 'ansible_search_path' from source: unknown 13830 1727204090.35628: variable 'ansible_search_path' from source: unknown 13830 1727204090.35659: calling self._execute() 13830 1727204090.35728: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.35732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.35743: variable 'omit' from source: magic vars 13830 1727204090.36015: variable 'ansible_distribution_major_version' from source: facts 13830 1727204090.36026: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204090.36115: variable 'profile_stat' from source: set_fact 13830 1727204090.36124: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204090.36139: when evaluation is False, skipping this task 13830 1727204090.36142: _execute() done 13830 1727204090.36145: dumping result to json 13830 1727204090.36147: done dumping result, returning 13830 1727204090.36153: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-1659-6b02-0000000005b6] 13830 1727204090.36159: sending task result for task 0affcd87-79f5-1659-6b02-0000000005b6 13830 1727204090.36244: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005b6 13830 1727204090.36246: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204090.36296: no more pending results, returning what we have 13830 1727204090.36301: results queue empty 13830 1727204090.36302: checking for any_errors_fatal 13830 1727204090.36309: done checking for any_errors_fatal 13830 1727204090.36310: checking for max_fail_percentage 13830 1727204090.36311: done checking for max_fail_percentage 13830 1727204090.36312: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.36313: done checking to see if all hosts have failed 13830 1727204090.36314: getting the remaining hosts for this loop 13830 1727204090.36315: done getting the remaining hosts for this loop 13830 1727204090.36319: getting the next task for host managed-node3 13830 1727204090.36326: done getting next task for host managed-node3 13830 1727204090.36328: ^ task is: TASK: Get NM profile info 13830 1727204090.36334: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204090.36337: getting variables 13830 1727204090.36339: in VariableManager get_vars() 13830 1727204090.36375: Calling all_inventory to load vars for managed-node3 13830 1727204090.36378: Calling groups_inventory to load vars for managed-node3 13830 1727204090.36380: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.36390: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.36392: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.36395: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.40509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204090.41416: done with get_vars() 13830 1727204090.41437: done getting variables 13830 1727204090.41478: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.064) 0:00:23.493 ***** 13830 1727204090.41500: entering _queue_task() for managed-node3/shell 13830 1727204090.41733: worker is 1 (out of 1 available) 13830 1727204090.41747: exiting _queue_task() for managed-node3/shell 13830 1727204090.41758: done queuing things up, now waiting for results queue to drain 13830 1727204090.41760: waiting for pending results... 
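The "Get NM profile info" task queued above (get_profile_stat.yml:25) runs next. Based on the rendered command and the register name referenced later in this log (nm_profile_exists), it plausibly looks like the following; treat this as a reconstruction under those assumptions rather than the exact file contents:

- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc  # rendered below with profile = bond0.1
  register: nm_profile_exists

The final grep /etc presumably keeps only connections whose backing file lives under /etc (for example /etc/NetworkManager/system-connections/), i.e. persistent profiles rather than runtime-only ones.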
13830 1727204090.41943: running TaskExecutor() for managed-node3/TASK: Get NM profile info 13830 1727204090.42056: in run() - task 0affcd87-79f5-1659-6b02-0000000005b7 13830 1727204090.42068: variable 'ansible_search_path' from source: unknown 13830 1727204090.42072: variable 'ansible_search_path' from source: unknown 13830 1727204090.42101: calling self._execute() 13830 1727204090.42175: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.42179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.42187: variable 'omit' from source: magic vars 13830 1727204090.42468: variable 'ansible_distribution_major_version' from source: facts 13830 1727204090.42479: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204090.42485: variable 'omit' from source: magic vars 13830 1727204090.42528: variable 'omit' from source: magic vars 13830 1727204090.42600: variable 'profile' from source: include params 13830 1727204090.42605: variable 'bond_port_profile' from source: include params 13830 1727204090.42653: variable 'bond_port_profile' from source: include params 13830 1727204090.42669: variable 'omit' from source: magic vars 13830 1727204090.42704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204090.42730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204090.42749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204090.42787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204090.42794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204090.42826: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204090.42830: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.42832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.42907: Set connection var ansible_connection to ssh 13830 1727204090.42917: Set connection var ansible_timeout to 10 13830 1727204090.42921: Set connection var ansible_shell_executable to /bin/sh 13830 1727204090.42924: Set connection var ansible_shell_type to sh 13830 1727204090.42929: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204090.42939: Set connection var ansible_pipelining to False 13830 1727204090.42957: variable 'ansible_shell_executable' from source: unknown 13830 1727204090.42960: variable 'ansible_connection' from source: unknown 13830 1727204090.42962: variable 'ansible_module_compression' from source: unknown 13830 1727204090.42967: variable 'ansible_shell_type' from source: unknown 13830 1727204090.42970: variable 'ansible_shell_executable' from source: unknown 13830 1727204090.42972: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.42975: variable 'ansible_pipelining' from source: unknown 13830 1727204090.42977: variable 'ansible_timeout' from source: unknown 13830 1727204090.42980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.43085: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204090.43095: variable 'omit' from source: magic vars 13830 1727204090.43100: starting attempt loop 13830 1727204090.43104: running the handler 13830 1727204090.43113: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204090.43128: _low_level_execute_command(): starting 13830 1727204090.43137: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204090.43672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.43682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.43710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.43723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.43781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.43794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.43844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.45376: stdout chunk (state=3): >>>/root <<< 13830 1727204090.45479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.45541: stderr chunk (state=3): >>><<< 13830 1727204090.45544: stdout chunk (state=3): >>><<< 13830 1727204090.45568: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204090.45578: _low_level_execute_command(): starting 13830 1727204090.45584: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112 `" && echo ansible-tmp-1727204090.4556608-15409-126927415347112="` echo /root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112 `" ) && sleep 0' 13830 1727204090.46039: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.46052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.46081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204090.46094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.46140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.46153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.46167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.46218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.48051: stdout chunk (state=3): >>>ansible-tmp-1727204090.4556608-15409-126927415347112=/root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112 <<< 13830 1727204090.48239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.48242: stdout chunk (state=3): >>><<< 13830 1727204090.48250: stderr chunk (state=3): >>><<< 13830 1727204090.48272: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204090.4556608-15409-126927415347112=/root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204090.48304: variable 'ansible_module_compression' from source: unknown 13830 1727204090.48371: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204090.48406: variable 'ansible_facts' from source: unknown 13830 1727204090.48506: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112/AnsiballZ_command.py 13830 1727204090.48667: Sending initial data 13830 1727204090.48671: Sent initial data (156 bytes) 13830 1727204090.49674: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204090.49682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.49692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.49704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.49743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.49750: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204090.49763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.49777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204090.49785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204090.49791: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204090.49798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.49806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.49816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.49823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204090.49829: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204090.49841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.49910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.49925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.49928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.50004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.51708: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204090.51739: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204090.51782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpws8nds8v /root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112/AnsiballZ_command.py <<< 13830 1727204090.51820: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204090.53068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.53177: stderr chunk (state=3): >>><<< 13830 1727204090.53181: stdout chunk (state=3): >>><<< 13830 1727204090.53183: done transferring module to remote 13830 1727204090.53185: _low_level_execute_command(): starting 13830 1727204090.53187: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112/ /root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112/AnsiballZ_command.py && sleep 0' 13830 1727204090.53628: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.53631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.53670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.53677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.53680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.53722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.53725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.53780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.55471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.55549: stderr chunk (state=3): >>><<< 13830 1727204090.55552: stdout chunk (state=3): >>><<< 13830 1727204090.55668: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204090.55672: _low_level_execute_command(): starting 13830 1727204090.55675: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112/AnsiballZ_command.py && sleep 0' 13830 1727204090.56218: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.56224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.56261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.56281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204090.56284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.56336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.56341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.56394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.72593: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:50.700023", "end": "2024-09-24 14:54:50.724900", "delta": "0:00:00.024877", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204090.74006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204090.74066: stderr chunk (state=3): >>><<< 13830 1727204090.74070: stdout chunk (state=3): >>><<< 13830 1727204090.74089: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:50.700023", "end": "2024-09-24 14:54:50.724900", "delta": "0:00:00.024877", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
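The module result just returned has rc=0 and a single stdout line, "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection", meaning nmcli found the bond0.1 profile and the keyfile that backs it. Purely as an illustration (not part of the playbook under test), a check over that registered result could be written like this:

- name: Hypothetical check of the captured nmcli output
  ansible.builtin.assert:
    that:
      - nm_profile_exists.rc == 0
      - "'/etc/NetworkManager/system-connections/' in nm_profile_exists.stdout"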
13830 1727204090.74120: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204090.74126: _low_level_execute_command(): starting 13830 1727204090.74133: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204090.4556608-15409-126927415347112/ > /dev/null 2>&1 && sleep 0' 13830 1727204090.74609: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204090.74612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.74649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204090.74652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204090.74655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204090.74710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204090.74714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204090.74717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204090.74771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204090.76620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204090.76678: stderr chunk (state=3): >>><<< 13830 1727204090.76681: stdout chunk (state=3): >>><<< 13830 1727204090.76697: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204090.76704: handler run complete 13830 1727204090.76721: Evaluated conditional (False): False 13830 1727204090.76729: attempt loop complete, returning result 13830 1727204090.76735: _execute() done 13830 1727204090.76737: dumping result to json 13830 1727204090.76739: done dumping result, returning 13830 1727204090.76746: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [0affcd87-79f5-1659-6b02-0000000005b7] 13830 1727204090.76751: sending task result for task 0affcd87-79f5-1659-6b02-0000000005b7 13830 1727204090.76855: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005b7 13830 1727204090.76858: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.024877", "end": "2024-09-24 14:54:50.724900", "rc": 0, "start": "2024-09-24 14:54:50.700023" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 13830 1727204090.76933: no more pending results, returning what we have 13830 1727204090.76937: results queue empty 13830 1727204090.76938: checking for any_errors_fatal 13830 1727204090.76946: done checking for any_errors_fatal 13830 1727204090.76947: checking for max_fail_percentage 13830 1727204090.76949: done checking for max_fail_percentage 13830 1727204090.76950: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.76951: done checking to see if all hosts have failed 13830 1727204090.76951: getting the remaining hosts for this loop 13830 1727204090.76953: done getting the remaining hosts for this loop 13830 1727204090.76957: getting the next task for host managed-node3 13830 1727204090.76966: done getting next task for host managed-node3 13830 1727204090.76969: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13830 1727204090.76975: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204090.76979: getting variables 13830 1727204090.76980: in VariableManager get_vars() 13830 1727204090.77011: Calling all_inventory to load vars for managed-node3 13830 1727204090.77013: Calling groups_inventory to load vars for managed-node3 13830 1727204090.77016: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.77026: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.77028: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.77033: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.77840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204090.78789: done with get_vars() 13830 1727204090.78807: done getting variables 13830 1727204090.78856: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.373) 0:00:23.866 ***** 13830 1727204090.78884: entering _queue_task() for managed-node3/set_fact 13830 1727204090.79120: worker is 1 (out of 1 available) 13830 1727204090.79135: exiting _queue_task() for managed-node3/set_fact 13830 1727204090.79146: done queuing things up, now waiting for results queue to drain 13830 1727204090.79148: waiting for pending results... 
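The 'Get NM profile info' result above came from a shell pipeline, nmcli -f NAME,FILENAME connection show piped through grep for the profile name and for a path under /etc. The task file (tasks/get_profile_stat.yml) is not reproduced in this log, so the following is only a minimal sketch of what such a task plausibly looks like: the register name is inferred from the 'nm_profile_exists.rc == 0' conditional evaluated by the next task, '{{ profile }}' is the templated form of the literal 'bond0.1' seen in the rendered command, and 'ignore_errors' is an assumption added because grep exits non-zero when no profile matches.

    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists   # name inferred from the conditional checked in the next task
      ignore_errors: true           # assumption: grep returns rc=1 when the profile is absent

Here the pipeline matched 'bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection', i.e. the profile exists and is stored as a NetworkManager keyfile under /etc, so rc=0.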
13830 1727204090.79327: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13830 1727204090.79424: in run() - task 0affcd87-79f5-1659-6b02-0000000005b8 13830 1727204090.79436: variable 'ansible_search_path' from source: unknown 13830 1727204090.79441: variable 'ansible_search_path' from source: unknown 13830 1727204090.79470: calling self._execute() 13830 1727204090.79543: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.79547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.79555: variable 'omit' from source: magic vars 13830 1727204090.79839: variable 'ansible_distribution_major_version' from source: facts 13830 1727204090.79850: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204090.79943: variable 'nm_profile_exists' from source: set_fact 13830 1727204090.79953: Evaluated conditional (nm_profile_exists.rc == 0): True 13830 1727204090.79958: variable 'omit' from source: magic vars 13830 1727204090.80000: variable 'omit' from source: magic vars 13830 1727204090.80025: variable 'omit' from source: magic vars 13830 1727204090.80068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204090.80096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204090.80113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204090.80125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204090.80140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204090.80166: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204090.80171: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.80174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.80242: Set connection var ansible_connection to ssh 13830 1727204090.80254: Set connection var ansible_timeout to 10 13830 1727204090.80263: Set connection var ansible_shell_executable to /bin/sh 13830 1727204090.80268: Set connection var ansible_shell_type to sh 13830 1727204090.80273: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204090.80280: Set connection var ansible_pipelining to False 13830 1727204090.80297: variable 'ansible_shell_executable' from source: unknown 13830 1727204090.80300: variable 'ansible_connection' from source: unknown 13830 1727204090.80303: variable 'ansible_module_compression' from source: unknown 13830 1727204090.80305: variable 'ansible_shell_type' from source: unknown 13830 1727204090.80308: variable 'ansible_shell_executable' from source: unknown 13830 1727204090.80310: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.80312: variable 'ansible_pipelining' from source: unknown 13830 1727204090.80314: variable 'ansible_timeout' from source: unknown 13830 1727204090.80318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.80423: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204090.80433: variable 'omit' from source: magic vars 13830 1727204090.80437: starting attempt loop 13830 1727204090.80439: running the handler 13830 1727204090.80449: handler run complete 13830 1727204090.80459: attempt loop complete, returning result 13830 1727204090.80462: _execute() done 13830 1727204090.80466: dumping result to json 13830 1727204090.80468: done dumping result, returning 13830 1727204090.80474: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-1659-6b02-0000000005b8] 13830 1727204090.80479: sending task result for task 0affcd87-79f5-1659-6b02-0000000005b8 13830 1727204090.80569: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005b8 13830 1727204090.80572: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13830 1727204090.80637: no more pending results, returning what we have 13830 1727204090.80641: results queue empty 13830 1727204090.80642: checking for any_errors_fatal 13830 1727204090.80650: done checking for any_errors_fatal 13830 1727204090.80650: checking for max_fail_percentage 13830 1727204090.80652: done checking for max_fail_percentage 13830 1727204090.80653: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.80653: done checking to see if all hosts have failed 13830 1727204090.80654: getting the remaining hosts for this loop 13830 1727204090.80656: done getting the remaining hosts for this loop 13830 1727204090.80660: getting the next task for host managed-node3 13830 1727204090.80670: done getting next task for host managed-node3 13830 1727204090.80673: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13830 1727204090.80689: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204090.80693: getting variables 13830 1727204090.80694: in VariableManager get_vars() 13830 1727204090.80719: Calling all_inventory to load vars for managed-node3 13830 1727204090.80722: Calling groups_inventory to load vars for managed-node3 13830 1727204090.80724: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.80735: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.80737: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.80740: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.81669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204090.82604: done with get_vars() 13830 1727204090.82626: done getting variables 13830 1727204090.82673: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204090.82766: variable 'profile' from source: include params 13830 1727204090.82770: variable 'bond_port_profile' from source: include params 13830 1727204090.82812: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.039) 0:00:23.906 ***** 13830 1727204090.82839: entering _queue_task() for managed-node3/command 13830 1727204090.83078: worker is 1 (out of 1 available) 13830 1727204090.83092: exiting _queue_task() for managed-node3/command 13830 1727204090.83104: done queuing things up, now waiting for results queue to drain 13830 1727204090.83106: waiting for pending results... 
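The set_fact result above shows exactly which facts the task at get_profile_stat.yml:35 sets and under which condition ('nm_profile_exists.rc == 0'). Reconstructed from that logged output alone, a sketch of the task would be:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0

These three booleans are what the later assert tasks in assert_profile_present.yml check.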
13830 1727204090.83287: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 13830 1727204090.83380: in run() - task 0affcd87-79f5-1659-6b02-0000000005ba 13830 1727204090.83392: variable 'ansible_search_path' from source: unknown 13830 1727204090.83395: variable 'ansible_search_path' from source: unknown 13830 1727204090.83424: calling self._execute() 13830 1727204090.83497: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.83501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.83514: variable 'omit' from source: magic vars 13830 1727204090.83779: variable 'ansible_distribution_major_version' from source: facts 13830 1727204090.83789: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204090.83876: variable 'profile_stat' from source: set_fact 13830 1727204090.83887: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204090.83890: when evaluation is False, skipping this task 13830 1727204090.83892: _execute() done 13830 1727204090.83895: dumping result to json 13830 1727204090.83898: done dumping result, returning 13830 1727204090.83904: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affcd87-79f5-1659-6b02-0000000005ba] 13830 1727204090.83910: sending task result for task 0affcd87-79f5-1659-6b02-0000000005ba 13830 1727204090.84004: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005ba 13830 1727204090.84007: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204090.84058: no more pending results, returning what we have 13830 1727204090.84062: results queue empty 13830 1727204090.84062: checking for any_errors_fatal 13830 1727204090.84070: done checking for any_errors_fatal 13830 1727204090.84071: checking for max_fail_percentage 13830 1727204090.84073: done checking for max_fail_percentage 13830 1727204090.84073: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.84074: done checking to see if all hosts have failed 13830 1727204090.84081: getting the remaining hosts for this loop 13830 1727204090.84083: done getting the remaining hosts for this loop 13830 1727204090.84087: getting the next task for host managed-node3 13830 1727204090.84094: done getting next task for host managed-node3 13830 1727204090.84096: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13830 1727204090.84102: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204090.84105: getting variables 13830 1727204090.84106: in VariableManager get_vars() 13830 1727204090.84137: Calling all_inventory to load vars for managed-node3 13830 1727204090.84139: Calling groups_inventory to load vars for managed-node3 13830 1727204090.84142: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.84152: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.84154: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.84156: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.84960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204090.85901: done with get_vars() 13830 1727204090.85921: done getting variables 13830 1727204090.85969: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204090.86056: variable 'profile' from source: include params 13830 1727204090.86059: variable 'bond_port_profile' from source: include params 13830 1727204090.86101: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.032) 0:00:23.939 ***** 13830 1727204090.86126: entering _queue_task() for managed-node3/set_fact 13830 1727204090.86370: worker is 1 (out of 1 available) 13830 1727204090.86383: exiting _queue_task() for managed-node3/set_fact 13830 1727204090.86394: done queuing things up, now waiting for results queue to drain 13830 1727204090.86396: waiting for pending results... 
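The task at get_profile_stat.yml:49 is skipped because its 'when: profile_stat.stat.exists' evaluates to false: the bond0.1 profile lives in /etc/NetworkManager/system-connections/bond0.1.nmconnection (a keyfile), so the initscripts-style ifcfg file that 'profile_stat' refers to does not exist. The exact command is not visible in this excerpt; the grep pattern, file path and register name below are illustrative assumptions that show only the gating pattern.

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path and pattern assumed
      register: ansible_managed_grep   # register name assumed
      when: profile_stat.stat.exists   # matches the "false_condition" reported above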
13830 1727204090.86579: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 13830 1727204090.86673: in run() - task 0affcd87-79f5-1659-6b02-0000000005bb 13830 1727204090.86689: variable 'ansible_search_path' from source: unknown 13830 1727204090.86693: variable 'ansible_search_path' from source: unknown 13830 1727204090.86720: calling self._execute() 13830 1727204090.86795: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.86799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.86808: variable 'omit' from source: magic vars 13830 1727204090.87077: variable 'ansible_distribution_major_version' from source: facts 13830 1727204090.87088: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204090.87175: variable 'profile_stat' from source: set_fact 13830 1727204090.87185: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204090.87189: when evaluation is False, skipping this task 13830 1727204090.87191: _execute() done 13830 1727204090.87194: dumping result to json 13830 1727204090.87196: done dumping result, returning 13830 1727204090.87202: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affcd87-79f5-1659-6b02-0000000005bb] 13830 1727204090.87208: sending task result for task 0affcd87-79f5-1659-6b02-0000000005bb 13830 1727204090.87298: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005bb 13830 1727204090.87301: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204090.87355: no more pending results, returning what we have 13830 1727204090.87359: results queue empty 13830 1727204090.87360: checking for any_errors_fatal 13830 1727204090.87370: done checking for any_errors_fatal 13830 1727204090.87371: checking for max_fail_percentage 13830 1727204090.87373: done checking for max_fail_percentage 13830 1727204090.87374: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.87374: done checking to see if all hosts have failed 13830 1727204090.87375: getting the remaining hosts for this loop 13830 1727204090.87377: done getting the remaining hosts for this loop 13830 1727204090.87380: getting the next task for host managed-node3 13830 1727204090.87388: done getting next task for host managed-node3 13830 1727204090.87390: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13830 1727204090.87396: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204090.87399: getting variables 13830 1727204090.87401: in VariableManager get_vars() 13830 1727204090.87428: Calling all_inventory to load vars for managed-node3 13830 1727204090.87433: Calling groups_inventory to load vars for managed-node3 13830 1727204090.87436: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.87451: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.87454: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.87457: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.88342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204090.89280: done with get_vars() 13830 1727204090.89299: done getting variables 13830 1727204090.89346: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204090.89433: variable 'profile' from source: include params 13830 1727204090.89437: variable 'bond_port_profile' from source: include params 13830 1727204090.89478: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.033) 0:00:23.973 ***** 13830 1727204090.89505: entering _queue_task() for managed-node3/command 13830 1727204090.89746: worker is 1 (out of 1 available) 13830 1727204090.89762: exiting _queue_task() for managed-node3/command 13830 1727204090.89776: done queuing things up, now waiting for results queue to drain 13830 1727204090.89779: waiting for pending results... 
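The same 'profile_stat.stat.exists' guard skips all four ifcfg-related tasks in this block (get and verify of the ansible_managed comment and of the fingerprint comment, get_profile_stat.yml:49 through :69). 'profile_stat' is presumably produced by an ansible.builtin.stat task earlier in get_profile_stat.yml, outside this excerpt; a hypothetical sketch of that producer, with the path assumed:

    - name: Stat the ifcfg file for the profile   # task name assumed
      ansible.builtin.stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path assumed
      register: profile_stat

On a keyfile-only system such as this one, 'profile_stat.stat.exists' is false and the whole ifcfg branch is skipped, which is what the repeated "skipping: [managed-node3]" results show.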
13830 1727204090.89955: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 13830 1727204090.90054: in run() - task 0affcd87-79f5-1659-6b02-0000000005bc 13830 1727204090.90072: variable 'ansible_search_path' from source: unknown 13830 1727204090.90077: variable 'ansible_search_path' from source: unknown 13830 1727204090.90099: calling self._execute() 13830 1727204090.90175: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.90179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.90187: variable 'omit' from source: magic vars 13830 1727204090.90453: variable 'ansible_distribution_major_version' from source: facts 13830 1727204090.90465: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204090.90550: variable 'profile_stat' from source: set_fact 13830 1727204090.90559: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204090.90562: when evaluation is False, skipping this task 13830 1727204090.90566: _execute() done 13830 1727204090.90569: dumping result to json 13830 1727204090.90571: done dumping result, returning 13830 1727204090.90580: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affcd87-79f5-1659-6b02-0000000005bc] 13830 1727204090.90583: sending task result for task 0affcd87-79f5-1659-6b02-0000000005bc 13830 1727204090.90675: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005bc 13830 1727204090.90677: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204090.90732: no more pending results, returning what we have 13830 1727204090.90737: results queue empty 13830 1727204090.90738: checking for any_errors_fatal 13830 1727204090.90743: done checking for any_errors_fatal 13830 1727204090.90744: checking for max_fail_percentage 13830 1727204090.90745: done checking for max_fail_percentage 13830 1727204090.90746: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.90747: done checking to see if all hosts have failed 13830 1727204090.90748: getting the remaining hosts for this loop 13830 1727204090.90749: done getting the remaining hosts for this loop 13830 1727204090.90753: getting the next task for host managed-node3 13830 1727204090.90760: done getting next task for host managed-node3 13830 1727204090.90763: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13830 1727204090.90770: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204090.90773: getting variables 13830 1727204090.90778: in VariableManager get_vars() 13830 1727204090.90808: Calling all_inventory to load vars for managed-node3 13830 1727204090.90811: Calling groups_inventory to load vars for managed-node3 13830 1727204090.90814: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.90823: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.90825: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.90828: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.91619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204090.92667: done with get_vars() 13830 1727204090.92684: done getting variables 13830 1727204090.92729: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204090.92816: variable 'profile' from source: include params 13830 1727204090.92819: variable 'bond_port_profile' from source: include params 13830 1727204090.92863: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.033) 0:00:24.007 ***** 13830 1727204090.92889: entering _queue_task() for managed-node3/set_fact 13830 1727204090.93133: worker is 1 (out of 1 available) 13830 1727204090.93146: exiting _queue_task() for managed-node3/set_fact 13830 1727204090.93158: done queuing things up, now waiting for results queue to drain 13830 1727204090.93160: waiting for pending results... 
13830 1727204090.93339: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 13830 1727204090.93446: in run() - task 0affcd87-79f5-1659-6b02-0000000005bd 13830 1727204090.93456: variable 'ansible_search_path' from source: unknown 13830 1727204090.93459: variable 'ansible_search_path' from source: unknown 13830 1727204090.93492: calling self._execute() 13830 1727204090.93570: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.93574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.93581: variable 'omit' from source: magic vars 13830 1727204090.93855: variable 'ansible_distribution_major_version' from source: facts 13830 1727204090.93867: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204090.93954: variable 'profile_stat' from source: set_fact 13830 1727204090.93966: Evaluated conditional (profile_stat.stat.exists): False 13830 1727204090.93969: when evaluation is False, skipping this task 13830 1727204090.93972: _execute() done 13830 1727204090.93975: dumping result to json 13830 1727204090.93977: done dumping result, returning 13830 1727204090.93983: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affcd87-79f5-1659-6b02-0000000005bd] 13830 1727204090.93989: sending task result for task 0affcd87-79f5-1659-6b02-0000000005bd 13830 1727204090.94084: done sending task result for task 0affcd87-79f5-1659-6b02-0000000005bd 13830 1727204090.94087: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13830 1727204090.94141: no more pending results, returning what we have 13830 1727204090.94146: results queue empty 13830 1727204090.94146: checking for any_errors_fatal 13830 1727204090.94154: done checking for any_errors_fatal 13830 1727204090.94155: checking for max_fail_percentage 13830 1727204090.94156: done checking for max_fail_percentage 13830 1727204090.94157: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.94158: done checking to see if all hosts have failed 13830 1727204090.94158: getting the remaining hosts for this loop 13830 1727204090.94160: done getting the remaining hosts for this loop 13830 1727204090.94165: getting the next task for host managed-node3 13830 1727204090.94175: done getting next task for host managed-node3 13830 1727204090.94178: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13830 1727204090.94183: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204090.94186: getting variables 13830 1727204090.94188: in VariableManager get_vars() 13830 1727204090.94225: Calling all_inventory to load vars for managed-node3 13830 1727204090.94228: Calling groups_inventory to load vars for managed-node3 13830 1727204090.94233: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.94243: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.94245: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.94248: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.95065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204090.96014: done with get_vars() 13830 1727204090.96034: done getting variables 13830 1727204090.96085: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204090.96178: variable 'profile' from source: include params 13830 1727204090.96181: variable 'bond_port_profile' from source: include params 13830 1727204090.96221: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.033) 0:00:24.040 ***** 13830 1727204090.96249: entering _queue_task() for managed-node3/assert 13830 1727204090.96504: worker is 1 (out of 1 available) 13830 1727204090.96517: exiting _queue_task() for managed-node3/assert 13830 1727204090.96535: done queuing things up, now waiting for results queue to drain 13830 1727204090.96537: waiting for pending results... 
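The 'Assert that the profile is present' task being queued here takes its 'profile' value from an include parameter: the variable-resolution lines repeatedly show 'profile' from source: include params, resolved via 'bond_port_profile'. The calling playbook is not part of this excerpt, so the following include is only a hedged sketch of that wiring; the task name is assumed, and how 'bond_port_profile' itself gets its value (presumably a loop over the bond port profiles) is not visible here.

    - name: Include assertions for one bond port profile   # name assumed
      ansible.builtin.include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ bond_port_profile }}"   # this mapping is what the variable-resolution lines above show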
13830 1727204090.96716: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.1' 13830 1727204090.96812: in run() - task 0affcd87-79f5-1659-6b02-0000000004e8 13830 1727204090.96819: variable 'ansible_search_path' from source: unknown 13830 1727204090.96822: variable 'ansible_search_path' from source: unknown 13830 1727204090.96851: calling self._execute() 13830 1727204090.96925: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.96929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.96938: variable 'omit' from source: magic vars 13830 1727204090.97205: variable 'ansible_distribution_major_version' from source: facts 13830 1727204090.97216: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204090.97222: variable 'omit' from source: magic vars 13830 1727204090.97263: variable 'omit' from source: magic vars 13830 1727204090.97334: variable 'profile' from source: include params 13830 1727204090.97337: variable 'bond_port_profile' from source: include params 13830 1727204090.97383: variable 'bond_port_profile' from source: include params 13830 1727204090.97398: variable 'omit' from source: magic vars 13830 1727204090.97436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204090.97467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204090.97484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204090.97497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204090.97507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204090.97533: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204090.97537: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.97539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.97611: Set connection var ansible_connection to ssh 13830 1727204090.97620: Set connection var ansible_timeout to 10 13830 1727204090.97626: Set connection var ansible_shell_executable to /bin/sh 13830 1727204090.97629: Set connection var ansible_shell_type to sh 13830 1727204090.97634: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204090.97642: Set connection var ansible_pipelining to False 13830 1727204090.97658: variable 'ansible_shell_executable' from source: unknown 13830 1727204090.97662: variable 'ansible_connection' from source: unknown 13830 1727204090.97666: variable 'ansible_module_compression' from source: unknown 13830 1727204090.97669: variable 'ansible_shell_type' from source: unknown 13830 1727204090.97671: variable 'ansible_shell_executable' from source: unknown 13830 1727204090.97673: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204090.97677: variable 'ansible_pipelining' from source: unknown 13830 1727204090.97680: variable 'ansible_timeout' from source: unknown 13830 1727204090.97682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204090.97785: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204090.97795: variable 'omit' from source: magic vars 13830 1727204090.97805: starting attempt loop 13830 1727204090.97808: running the handler 13830 1727204090.97889: variable 'lsr_net_profile_exists' from source: set_fact 13830 1727204090.97892: Evaluated conditional (lsr_net_profile_exists): True 13830 1727204090.97899: handler run complete 13830 1727204090.97914: attempt loop complete, returning result 13830 1727204090.97917: _execute() done 13830 1727204090.97921: dumping result to json 13830 1727204090.97923: done dumping result, returning 13830 1727204090.97928: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.1' [0affcd87-79f5-1659-6b02-0000000004e8] 13830 1727204090.97935: sending task result for task 0affcd87-79f5-1659-6b02-0000000004e8 13830 1727204090.98022: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004e8 13830 1727204090.98025: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204090.98076: no more pending results, returning what we have 13830 1727204090.98080: results queue empty 13830 1727204090.98081: checking for any_errors_fatal 13830 1727204090.98090: done checking for any_errors_fatal 13830 1727204090.98090: checking for max_fail_percentage 13830 1727204090.98092: done checking for max_fail_percentage 13830 1727204090.98093: checking to see if all hosts have failed and the running result is not ok 13830 1727204090.98094: done checking to see if all hosts have failed 13830 1727204090.98095: getting the remaining hosts for this loop 13830 1727204090.98096: done getting the remaining hosts for this loop 13830 1727204090.98100: getting the next task for host managed-node3 13830 1727204090.98107: done getting next task for host managed-node3 13830 1727204090.98110: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13830 1727204090.98115: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204090.98125: getting variables 13830 1727204090.98127: in VariableManager get_vars() 13830 1727204090.98161: Calling all_inventory to load vars for managed-node3 13830 1727204090.98166: Calling groups_inventory to load vars for managed-node3 13830 1727204090.98170: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204090.98178: Calling all_plugins_play to load vars for managed-node3 13830 1727204090.98181: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204090.98183: Calling groups_plugins_play to load vars for managed-node3 13830 1727204090.99160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204091.00084: done with get_vars() 13830 1727204091.00100: done getting variables 13830 1727204091.00146: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204091.00240: variable 'profile' from source: include params 13830 1727204091.00243: variable 'bond_port_profile' from source: include params 13830 1727204091.00284: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.040) 0:00:24.081 ***** 13830 1727204091.00310: entering _queue_task() for managed-node3/assert 13830 1727204091.00549: worker is 1 (out of 1 available) 13830 1727204091.00567: exiting _queue_task() for managed-node3/assert 13830 1727204091.00579: done queuing things up, now waiting for results queue to drain 13830 1727204091.00581: waiting for pending results... 
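The three assert tasks in assert_profile_present.yml (task paths :5, :10 and :15 in the headers above and below) each test one of the flags set earlier by get_profile_stat.yml. The file itself is not shown in this log, but the logged conditionals ('lsr_net_profile_exists', 'lsr_net_profile_ansible_managed', 'lsr_net_profile_fingerprint') pin down the shape; a minimal sketch:

    - name: Assert that the profile is present - '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      ansible.builtin.assert:
        that:
          - lsr_net_profile_fingerprint

All three conditionals evaluate to True here, hence the "All assertions passed" results.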
13830 1727204091.00754: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 13830 1727204091.00853: in run() - task 0affcd87-79f5-1659-6b02-0000000004e9 13830 1727204091.00867: variable 'ansible_search_path' from source: unknown 13830 1727204091.00871: variable 'ansible_search_path' from source: unknown 13830 1727204091.00898: calling self._execute() 13830 1727204091.00975: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.00979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.00987: variable 'omit' from source: magic vars 13830 1727204091.01258: variable 'ansible_distribution_major_version' from source: facts 13830 1727204091.01272: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204091.01276: variable 'omit' from source: magic vars 13830 1727204091.01315: variable 'omit' from source: magic vars 13830 1727204091.01387: variable 'profile' from source: include params 13830 1727204091.01391: variable 'bond_port_profile' from source: include params 13830 1727204091.01437: variable 'bond_port_profile' from source: include params 13830 1727204091.01451: variable 'omit' from source: magic vars 13830 1727204091.01489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204091.01519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204091.01538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204091.01551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.01561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.01588: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204091.01594: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.01596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.01668: Set connection var ansible_connection to ssh 13830 1727204091.01681: Set connection var ansible_timeout to 10 13830 1727204091.01686: Set connection var ansible_shell_executable to /bin/sh 13830 1727204091.01689: Set connection var ansible_shell_type to sh 13830 1727204091.01696: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204091.01706: Set connection var ansible_pipelining to False 13830 1727204091.01725: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.01728: variable 'ansible_connection' from source: unknown 13830 1727204091.01734: variable 'ansible_module_compression' from source: unknown 13830 1727204091.01736: variable 'ansible_shell_type' from source: unknown 13830 1727204091.01738: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.01741: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.01746: variable 'ansible_pipelining' from source: unknown 13830 1727204091.01748: variable 'ansible_timeout' from source: unknown 13830 1727204091.01751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.01857: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204091.01868: variable 'omit' from source: magic vars 13830 1727204091.01873: starting attempt loop 13830 1727204091.01877: running the handler 13830 1727204091.01954: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13830 1727204091.01957: Evaluated conditional (lsr_net_profile_ansible_managed): True 13830 1727204091.01965: handler run complete 13830 1727204091.01975: attempt loop complete, returning result 13830 1727204091.01978: _execute() done 13830 1727204091.01981: dumping result to json 13830 1727204091.01983: done dumping result, returning 13830 1727204091.01989: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affcd87-79f5-1659-6b02-0000000004e9] 13830 1727204091.01994: sending task result for task 0affcd87-79f5-1659-6b02-0000000004e9 13830 1727204091.02088: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004e9 13830 1727204091.02090: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204091.02167: no more pending results, returning what we have 13830 1727204091.02171: results queue empty 13830 1727204091.02172: checking for any_errors_fatal 13830 1727204091.02177: done checking for any_errors_fatal 13830 1727204091.02178: checking for max_fail_percentage 13830 1727204091.02179: done checking for max_fail_percentage 13830 1727204091.02180: checking to see if all hosts have failed and the running result is not ok 13830 1727204091.02181: done checking to see if all hosts have failed 13830 1727204091.02182: getting the remaining hosts for this loop 13830 1727204091.02184: done getting the remaining hosts for this loop 13830 1727204091.02187: getting the next task for host managed-node3 13830 1727204091.02194: done getting next task for host managed-node3 13830 1727204091.02196: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13830 1727204091.02200: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204091.02204: getting variables 13830 1727204091.02205: in VariableManager get_vars() 13830 1727204091.02237: Calling all_inventory to load vars for managed-node3 13830 1727204091.02240: Calling groups_inventory to load vars for managed-node3 13830 1727204091.02243: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204091.02252: Calling all_plugins_play to load vars for managed-node3 13830 1727204091.02254: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204091.02256: Calling groups_plugins_play to load vars for managed-node3 13830 1727204091.03071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204091.04015: done with get_vars() 13830 1727204091.04038: done getting variables 13830 1727204091.04086: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204091.04178: variable 'profile' from source: include params 13830 1727204091.04181: variable 'bond_port_profile' from source: include params 13830 1727204091.04222: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.039) 0:00:24.120 ***** 13830 1727204091.04249: entering _queue_task() for managed-node3/assert 13830 1727204091.04493: worker is 1 (out of 1 available) 13830 1727204091.04509: exiting _queue_task() for managed-node3/assert 13830 1727204091.04522: done queuing things up, now waiting for results queue to drain 13830 1727204091.04524: waiting for pending results... 
13830 1727204091.04705: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.1 13830 1727204091.04797: in run() - task 0affcd87-79f5-1659-6b02-0000000004ea 13830 1727204091.04810: variable 'ansible_search_path' from source: unknown 13830 1727204091.04814: variable 'ansible_search_path' from source: unknown 13830 1727204091.04844: calling self._execute() 13830 1727204091.04917: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.04921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.04930: variable 'omit' from source: magic vars 13830 1727204091.05200: variable 'ansible_distribution_major_version' from source: facts 13830 1727204091.05211: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204091.05217: variable 'omit' from source: magic vars 13830 1727204091.05253: variable 'omit' from source: magic vars 13830 1727204091.05323: variable 'profile' from source: include params 13830 1727204091.05327: variable 'bond_port_profile' from source: include params 13830 1727204091.05374: variable 'bond_port_profile' from source: include params 13830 1727204091.05390: variable 'omit' from source: magic vars 13830 1727204091.05427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204091.05455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204091.05475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204091.05488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.05501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.05534: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204091.05537: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.05540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.05606: Set connection var ansible_connection to ssh 13830 1727204091.05618: Set connection var ansible_timeout to 10 13830 1727204091.05624: Set connection var ansible_shell_executable to /bin/sh 13830 1727204091.05627: Set connection var ansible_shell_type to sh 13830 1727204091.05634: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204091.05640: Set connection var ansible_pipelining to False 13830 1727204091.05657: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.05660: variable 'ansible_connection' from source: unknown 13830 1727204091.05662: variable 'ansible_module_compression' from source: unknown 13830 1727204091.05668: variable 'ansible_shell_type' from source: unknown 13830 1727204091.05671: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.05673: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.05675: variable 'ansible_pipelining' from source: unknown 13830 1727204091.05677: variable 'ansible_timeout' from source: unknown 13830 1727204091.05679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.05782: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204091.05791: variable 'omit' from source: magic vars 13830 1727204091.05797: starting attempt loop 13830 1727204091.05799: running the handler 13830 1727204091.05876: variable 'lsr_net_profile_fingerprint' from source: set_fact 13830 1727204091.05879: Evaluated conditional (lsr_net_profile_fingerprint): True 13830 1727204091.05886: handler run complete 13830 1727204091.05897: attempt loop complete, returning result 13830 1727204091.05899: _execute() done 13830 1727204091.05902: dumping result to json 13830 1727204091.05904: done dumping result, returning 13830 1727204091.05910: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.1 [0affcd87-79f5-1659-6b02-0000000004ea] 13830 1727204091.05915: sending task result for task 0affcd87-79f5-1659-6b02-0000000004ea 13830 1727204091.06004: done sending task result for task 0affcd87-79f5-1659-6b02-0000000004ea 13830 1727204091.06007: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204091.06067: no more pending results, returning what we have 13830 1727204091.06071: results queue empty 13830 1727204091.06072: checking for any_errors_fatal 13830 1727204091.06079: done checking for any_errors_fatal 13830 1727204091.06080: checking for max_fail_percentage 13830 1727204091.06081: done checking for max_fail_percentage 13830 1727204091.06083: checking to see if all hosts have failed and the running result is not ok 13830 1727204091.06083: done checking to see if all hosts have failed 13830 1727204091.06084: getting the remaining hosts for this loop 13830 1727204091.06085: done getting the remaining hosts for this loop 13830 1727204091.06089: getting the next task for host managed-node3 13830 1727204091.06100: done getting next task for host managed-node3 13830 1727204091.06102: ^ task is: TASK: ** TEST check bond settings 13830 1727204091.06106: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204091.06110: getting variables 13830 1727204091.06111: in VariableManager get_vars() 13830 1727204091.06149: Calling all_inventory to load vars for managed-node3 13830 1727204091.06155: Calling groups_inventory to load vars for managed-node3 13830 1727204091.06158: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204091.06169: Calling all_plugins_play to load vars for managed-node3 13830 1727204091.06171: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204091.06174: Calling groups_plugins_play to load vars for managed-node3 13830 1727204091.07159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204091.08267: done with get_vars() 13830 1727204091.08295: done getting variables 13830 1727204091.08361: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.041) 0:00:24.162 ***** 13830 1727204091.08402: entering _queue_task() for managed-node3/command 13830 1727204091.08751: worker is 1 (out of 1 available) 13830 1727204091.08767: exiting _queue_task() for managed-node3/command 13830 1727204091.08779: done queuing things up, now waiting for results queue to drain 13830 1727204091.08781: waiting for pending results... 
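The task banner above points at assert_bond_options.yml in the fedora.linux_system_roles network test suite. The task file itself is not reproduced in this log, but the trace that follows (an ansible.legacy.command action running `cat /sys/class/net/nm-bond/bonding/<option>`, a loop variable named bond_opt, `attempts: 1`, and the conditional `bond_opt.value in result.stdout`) is consistent with a retry loop along the lines of the minimal sketch below. It is a reconstruction from this log, not the verbatim task; everything beyond the names that literally appear in the trace (controller_device, bond_opt, bond_options_to_assert, result) is an assumption.

- name: "** TEST check bond settings"
  # Reconstructed sketch: read each bonding option from sysfs and retry until
  # the expected value shows up in the command output.
  command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  until: bond_opt.value in result.stdout
  loop: "{{ bond_options_to_assert }}"
  loop_control:
    loop_var: bond_opt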
13830 1727204091.09089: running TaskExecutor() for managed-node3/TASK: ** TEST check bond settings 13830 1727204091.09217: in run() - task 0affcd87-79f5-1659-6b02-000000000400 13830 1727204091.09244: variable 'ansible_search_path' from source: unknown 13830 1727204091.09253: variable 'ansible_search_path' from source: unknown 13830 1727204091.09304: variable 'bond_options_to_assert' from source: play vars 13830 1727204091.09521: variable 'bond_options_to_assert' from source: play vars 13830 1727204091.09737: variable 'omit' from source: magic vars 13830 1727204091.09885: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.09899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.09914: variable 'omit' from source: magic vars 13830 1727204091.10171: variable 'ansible_distribution_major_version' from source: facts 13830 1727204091.10187: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204091.10201: variable 'omit' from source: magic vars 13830 1727204091.10260: variable 'omit' from source: magic vars 13830 1727204091.10478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204091.12669: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204091.12752: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204091.12799: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204091.12842: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204091.12876: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204091.12991: variable 'controller_device' from source: play vars 13830 1727204091.13000: variable 'bond_opt' from source: unknown 13830 1727204091.13024: variable 'omit' from source: magic vars 13830 1727204091.13058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204091.13092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204091.13116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204091.13140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.13153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.13188: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204091.13196: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.13204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.13297: Set connection var ansible_connection to ssh 13830 1727204091.13313: Set connection var ansible_timeout to 10 13830 1727204091.13322: Set connection var ansible_shell_executable to /bin/sh 13830 1727204091.13328: Set connection var ansible_shell_type to sh 13830 1727204091.13339: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204091.13352: Set connection var ansible_pipelining to False 13830 
1727204091.13380: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.13387: variable 'ansible_connection' from source: unknown 13830 1727204091.13393: variable 'ansible_module_compression' from source: unknown 13830 1727204091.13399: variable 'ansible_shell_type' from source: unknown 13830 1727204091.13404: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.13413: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.13459: variable 'ansible_pipelining' from source: unknown 13830 1727204091.13469: variable 'ansible_timeout' from source: unknown 13830 1727204091.13476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.13574: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204091.13589: variable 'omit' from source: magic vars 13830 1727204091.13597: starting attempt loop 13830 1727204091.13602: running the handler 13830 1727204091.13620: _low_level_execute_command(): starting 13830 1727204091.13633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204091.14351: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204091.14371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.14389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.14409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.14455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.14469: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204091.14483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.14500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204091.14511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204091.14522: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204091.14537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.14550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.14566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.14578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.14589: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204091.14601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.14681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.14704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.14719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.14801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 13830 1727204091.16492: stdout chunk (state=3): >>>/root <<< 13830 1727204091.16688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.16692: stdout chunk (state=3): >>><<< 13830 1727204091.16694: stderr chunk (state=3): >>><<< 13830 1727204091.16812: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.16824: _low_level_execute_command(): starting 13830 1727204091.16827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108 `" && echo ansible-tmp-1727204091.1671686-15434-46393107035108="` echo /root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108 `" ) && sleep 0' 13830 1727204091.17835: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204091.18316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.18333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.18350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.18398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.18583: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204091.18598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.18614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204091.18625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204091.18635: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204091.18648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.18660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.18679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.18691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.18702: stderr chunk 
(state=3): >>>debug2: match found <<< 13830 1727204091.18717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.18796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.18822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.18838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.18923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.20952: stdout chunk (state=3): >>>ansible-tmp-1727204091.1671686-15434-46393107035108=/root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108 <<< 13830 1727204091.21152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.21155: stdout chunk (state=3): >>><<< 13830 1727204091.21158: stderr chunk (state=3): >>><<< 13830 1727204091.21469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204091.1671686-15434-46393107035108=/root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.21473: variable 'ansible_module_compression' from source: unknown 13830 1727204091.21476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204091.21478: variable 'ansible_facts' from source: unknown 13830 1727204091.21480: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108/AnsiballZ_command.py 13830 1727204091.21942: Sending initial data 13830 1727204091.21952: Sent initial data (155 bytes) 13830 1727204091.23994: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.23999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.24025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.24029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.24031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204091.24041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.24117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.24152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.24318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.26071: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204091.26099: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204091.26142: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpo727ergz /root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108/AnsiballZ_command.py <<< 13830 1727204091.26181: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204091.27311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.27471: stderr chunk (state=3): >>><<< 13830 1727204091.27474: stdout chunk (state=3): >>><<< 13830 1727204091.27589: done transferring module to remote 13830 1727204091.27592: _low_level_execute_command(): starting 13830 1727204091.27595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108/ /root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108/AnsiballZ_command.py && sleep 0' 13830 1727204091.28402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.28406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.28449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.28453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.28469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 
1727204091.28476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.28489: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204091.28494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.28572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.28597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.28662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.30421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.30426: stdout chunk (state=3): >>><<< 13830 1727204091.30430: stderr chunk (state=3): >>><<< 13830 1727204091.30453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.30457: _low_level_execute_command(): starting 13830 1727204091.30459: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108/AnsiballZ_command.py && sleep 0' 13830 1727204091.31496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204091.31504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.31513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.31527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.31571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.31578: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204091.31588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.31601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204091.31610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204091.31616: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 
1727204091.31624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.31635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.31647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.31654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.31660: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204091.31670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.31746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.31767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.31779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.31857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.45561: stdout chunk (state=3): >>> {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-24 14:54:51.451244", "end": "2024-09-24 14:54:51.454504", "delta": "0:00:00.003260", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204091.47014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204091.47019: stdout chunk (state=3): >>><<< 13830 1727204091.47022: stderr chunk (state=3): >>><<< 13830 1727204091.47075: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-24 14:54:51.451244", "end": "2024-09-24 14:54:51.454504", "delta": "0:00:00.003260", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
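Only two loop items are fully visible in this excerpt: {'key': 'mode', 'value': '802.3ad'} just below, and {'key': 'ad_actor_sys_prio', 'value': '65535'} a little further on; the loop then continues past the end of the excerpt. The play variable bond_options_to_assert that drives the loop therefore presumably contains at least these entries. The literal form sketched here (a list of key/value mappings) is an assumption; the same items could equally come from a dict filtered through dict2items.

bond_options_to_assert:
  - { key: mode, value: "802.3ad" }            # sysfs reports "802.3ad 4" for this option
  - { key: ad_actor_sys_prio, value: "65535" }
  # further options follow in the full run but fall outside this excerpt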
13830 1727204091.47113: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204091.47117: _low_level_execute_command(): starting 13830 1727204091.47120: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204091.1671686-15434-46393107035108/ > /dev/null 2>&1 && sleep 0' 13830 1727204091.47724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204091.47742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.47752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.47778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.47812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.47820: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204091.47830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.47851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204091.47859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204091.47867: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204091.47884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.47892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.47910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.47917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.47923: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204091.47936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.48019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.48034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.48041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.48122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.50017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.50071: stderr chunk (state=3): >>><<< 13830 1727204091.50075: stdout chunk (state=3): >>><<< 13830 1727204091.50092: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.50097: handler run complete 13830 1727204091.50115: Evaluated conditional (False): False 13830 1727204091.50240: variable 'bond_opt' from source: unknown 13830 1727204091.50244: variable 'result' from source: unknown 13830 1727204091.50261: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204091.50271: attempt loop complete, returning result 13830 1727204091.50287: variable 'bond_opt' from source: unknown 13830 1727204091.50345: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'mode', 'value': '802.3ad'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "802.3ad" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003260", "end": "2024-09-24 14:54:51.454504", "rc": 0, "start": "2024-09-24 14:54:51.451244" } STDOUT: 802.3ad 4 13830 1727204091.50554: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.50557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.50560: variable 'omit' from source: magic vars 13830 1727204091.50620: variable 'ansible_distribution_major_version' from source: facts 13830 1727204091.50624: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204091.50629: variable 'omit' from source: magic vars 13830 1727204091.50644: variable 'omit' from source: magic vars 13830 1727204091.50789: variable 'controller_device' from source: play vars 13830 1727204091.50793: variable 'bond_opt' from source: unknown 13830 1727204091.50805: variable 'omit' from source: magic vars 13830 1727204091.50823: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204091.50830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.50839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.50850: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204091.50852: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.50855: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed-node3' 13830 1727204091.50909: Set connection var ansible_connection to ssh 13830 1727204091.50916: Set connection var ansible_timeout to 10 13830 1727204091.50926: Set connection var ansible_shell_executable to /bin/sh 13830 1727204091.50936: Set connection var ansible_shell_type to sh 13830 1727204091.50943: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204091.50952: Set connection var ansible_pipelining to False 13830 1727204091.50973: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.50976: variable 'ansible_connection' from source: unknown 13830 1727204091.50978: variable 'ansible_module_compression' from source: unknown 13830 1727204091.50981: variable 'ansible_shell_type' from source: unknown 13830 1727204091.50983: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.50985: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.50996: variable 'ansible_pipelining' from source: unknown 13830 1727204091.50999: variable 'ansible_timeout' from source: unknown 13830 1727204091.51004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.51082: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204091.51092: variable 'omit' from source: magic vars 13830 1727204091.51095: starting attempt loop 13830 1727204091.51097: running the handler 13830 1727204091.51106: _low_level_execute_command(): starting 13830 1727204091.51110: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204091.51640: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.51645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.51669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.51675: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.51699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204091.51702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204091.51704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.51752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.51760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.51826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.53496: stdout chunk (state=3): >>>/root <<< 13830 
1727204091.53600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.53658: stderr chunk (state=3): >>><<< 13830 1727204091.53661: stdout chunk (state=3): >>><<< 13830 1727204091.53678: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.53686: _low_level_execute_command(): starting 13830 1727204091.53692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112 `" && echo ansible-tmp-1727204091.5367851-15434-202565561436112="` echo /root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112 `" ) && sleep 0' 13830 1727204091.54163: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.54169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.54201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.54213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.54265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.54278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.54333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.56281: stdout chunk (state=3): >>>ansible-tmp-1727204091.5367851-15434-202565561436112=/root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112 <<< 13830 1727204091.56397: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.56460: stderr chunk (state=3): >>><<< 13830 1727204091.56465: stdout chunk (state=3): >>><<< 13830 1727204091.56481: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204091.5367851-15434-202565561436112=/root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.56502: variable 'ansible_module_compression' from source: unknown 13830 1727204091.56542: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204091.56557: variable 'ansible_facts' from source: unknown 13830 1727204091.56603: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112/AnsiballZ_command.py 13830 1727204091.56706: Sending initial data 13830 1727204091.56710: Sent initial data (156 bytes) 13830 1727204091.57424: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.57432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.57469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.57486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.57500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.57550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.57556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.57569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 
1727204091.57627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.59474: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204091.59509: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204091.59551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpgffarf8z /root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112/AnsiballZ_command.py <<< 13830 1727204091.59584: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204091.60466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.60584: stderr chunk (state=3): >>><<< 13830 1727204091.60587: stdout chunk (state=3): >>><<< 13830 1727204091.60603: done transferring module to remote 13830 1727204091.60611: _low_level_execute_command(): starting 13830 1727204091.60615: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112/ /root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112/AnsiballZ_command.py && sleep 0' 13830 1727204091.61081: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.61087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.61132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.61136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.61138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.61190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.61193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.61202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.61259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.63114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.63178: stderr 
chunk (state=3): >>><<< 13830 1727204091.63182: stdout chunk (state=3): >>><<< 13830 1727204091.63196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.63199: _low_level_execute_command(): starting 13830 1727204091.63204: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112/AnsiballZ_command.py && sleep 0' 13830 1727204091.63686: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.63690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.63725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.63741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.63790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.63802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.63862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.77152: stdout chunk (state=3): >>> {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-24 14:54:51.767505", "end": "2024-09-24 14:54:51.770571", "delta": "0:00:00.003066", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": 
null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204091.78251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204091.78315: stderr chunk (state=3): >>><<< 13830 1727204091.78319: stdout chunk (state=3): >>><<< 13830 1727204091.78336: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-24 14:54:51.767505", "end": "2024-09-24 14:54:51.770571", "delta": "0:00:00.003066", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204091.78359: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204091.78366: _low_level_execute_command(): starting 13830 1727204091.78371: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204091.5367851-15434-202565561436112/ > /dev/null 2>&1 && sleep 0' 13830 1727204091.78861: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.78868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.78904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.78916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.78971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.78984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.79046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.80800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.80859: stderr chunk (state=3): >>><<< 13830 1727204091.80865: stdout chunk (state=3): >>><<< 13830 1727204091.80886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.80891: handler run complete 13830 1727204091.80908: Evaluated conditional (False): False 13830 1727204091.81026: variable 'bond_opt' from source: unknown 13830 1727204091.81030: variable 'result' from source: unknown 13830 1727204091.81043: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204091.81053: attempt loop complete, returning result 13830 1727204091.81069: variable 'bond_opt' from source: unknown 13830 1727204091.81122: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'ad_actor_sys_prio', 'value': '65535'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_sys_prio", "value": "65535" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio" ], "delta": "0:00:00.003066", "end": "2024-09-24 14:54:51.770571", "rc": 0, "start": "2024-09-24 14:54:51.767505" } STDOUT: 65535 13830 1727204091.81258: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.81261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.81265: variable 'omit' from source: magic vars 13830 1727204091.81355: variable 'ansible_distribution_major_version' from source: facts 13830 1727204091.81360: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204091.81362: variable 'omit' from source: magic vars 13830 1727204091.81382: variable 'omit' from source: magic vars 13830 1727204091.81494: variable 'controller_device' from source: play vars 13830 1727204091.81498: variable 'bond_opt' from source: unknown 13830 1727204091.81512: variable 'omit' from source: magic vars 13830 1727204091.81529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204091.81539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.81544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204091.81555: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204091.81557: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.81560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.81612: Set connection var ansible_connection to ssh 13830 1727204091.81619: Set connection var ansible_timeout to 10 13830 1727204091.81624: Set connection var ansible_shell_executable to /bin/sh 13830 1727204091.81626: Set connection var ansible_shell_type to sh 13830 1727204091.81631: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204091.81640: Set connection var ansible_pipelining to False 13830 1727204091.81655: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.81658: variable 'ansible_connection' from source: unknown 13830 1727204091.81660: variable 
'ansible_module_compression' from source: unknown 13830 1727204091.81662: variable 'ansible_shell_type' from source: unknown 13830 1727204091.81666: variable 'ansible_shell_executable' from source: unknown 13830 1727204091.81668: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204091.81672: variable 'ansible_pipelining' from source: unknown 13830 1727204091.81675: variable 'ansible_timeout' from source: unknown 13830 1727204091.81679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204091.81747: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204091.81753: variable 'omit' from source: magic vars 13830 1727204091.81756: starting attempt loop 13830 1727204091.81758: running the handler 13830 1727204091.81766: _low_level_execute_command(): starting 13830 1727204091.81769: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204091.82248: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.82255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.82287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204091.82290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.82292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.82343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.82346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.82404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.83923: stdout chunk (state=3): >>>/root <<< 13830 1727204091.84024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.84088: stderr chunk (state=3): >>><<< 13830 1727204091.84091: stdout chunk (state=3): >>><<< 13830 1727204091.84107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.84115: _low_level_execute_command(): starting 13830 1727204091.84120: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624 `" && echo ansible-tmp-1727204091.8410714-15434-264528940809624="` echo /root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624 `" ) && sleep 0' 13830 1727204091.84725: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.84741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.84825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.86694: stdout chunk (state=3): >>>ansible-tmp-1727204091.8410714-15434-264528940809624=/root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624 <<< 13830 1727204091.86896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.86903: stdout chunk (state=3): >>><<< 13830 1727204091.86905: stderr chunk (state=3): >>><<< 13830 1727204091.86918: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204091.8410714-15434-264528940809624=/root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.86944: variable 'ansible_module_compression' from source: unknown 13830 1727204091.86992: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204091.87071: variable 'ansible_facts' from source: unknown 13830 1727204091.87079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624/AnsiballZ_command.py 13830 1727204091.87216: Sending initial data 13830 1727204091.87227: Sent initial data (156 bytes) 13830 1727204091.88166: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204091.88182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.88193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.88207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.88247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.88255: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204091.88265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.88281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204091.88290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204091.88293: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204091.88302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.88310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.88321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.88327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.88334: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204091.88343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.88426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.88429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.88442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.88500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.90341: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204091.90380: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204091.90420: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpd_oq6bsw /root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624/AnsiballZ_command.py <<< 13830 1727204091.90452: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204091.91303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.91557: stderr chunk (state=3): >>><<< 13830 1727204091.91560: stdout chunk (state=3): >>><<< 13830 1727204091.91563: done transferring module to remote 13830 1727204091.91568: _low_level_execute_command(): starting 13830 1727204091.91570: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624/ /root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624/AnsiballZ_command.py && sleep 0' 13830 1727204091.92396: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.92400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.92447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204091.92450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.92452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.92512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.92534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.92537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.92592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204091.94456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204091.94507: stderr chunk (state=3): >>><<< 13830 1727204091.94509: stdout chunk (state=3): >>><<< 13830 1727204091.94571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204091.94574: _low_level_execute_command(): starting 13830 1727204091.94577: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624/AnsiballZ_command.py && sleep 0' 13830 1727204091.94976: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.94982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.95011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204091.95025: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204091.95030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.95041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204091.95048: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204091.95053: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204091.95061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204091.95073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204091.95078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204091.95126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204091.95145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204091.95153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204091.95218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.09212: stdout chunk (state=3): >>> {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-24 14:54:52.087802", "end": "2024-09-24 14:54:52.091087", "delta": "0:00:00.003285", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204092.10572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204092.10578: stdout chunk (state=3): >>><<< 13830 1727204092.10580: stderr chunk (state=3): >>><<< 13830 1727204092.10721: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-24 14:54:52.087802", "end": "2024-09-24 14:54:52.091087", "delta": "0:00:00.003285", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204092.10725: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_system', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204092.10728: _low_level_execute_command(): starting 13830 1727204092.10730: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204091.8410714-15434-264528940809624/ > /dev/null 2>&1 && sleep 0' 13830 1727204092.11294: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.11309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.11323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.11340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.11388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.11402: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.11415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.11431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.11443: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.11453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204092.11466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.11479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.11493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.11508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.11517: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204092.11529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.11597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.11623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.11640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.11717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.13519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.13634: stderr chunk (state=3): >>><<< 13830 1727204092.13645: stdout chunk (state=3): >>><<< 13830 1727204092.13873: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204092.13882: handler run complete 13830 1727204092.13885: Evaluated conditional (False): False 13830 1727204092.13887: variable 'bond_opt' from source: unknown 13830 1727204092.13889: variable 'result' from source: unknown 13830 1727204092.13906: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204092.13921: attempt loop complete, returning result 13830 1727204092.13943: variable 'bond_opt' from source: unknown 13830 1727204092.14041: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'ad_actor_system', 'value': '00:00:5e:00:53:5d'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_system", "value": "00:00:5e:00:53:5d" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_system" ], "delta": "0:00:00.003285", "end": "2024-09-24 14:54:52.091087", "rc": 0, "start": "2024-09-24 14:54:52.087802" } STDOUT: 00:00:5e:00:53:5d 13830 1727204092.14267: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204092.14290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204092.14308: variable 'omit' from source: magic vars 13830 1727204092.14473: variable 'ansible_distribution_major_version' from source: facts 13830 1727204092.14484: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204092.14491: variable 'omit' from source: magic vars 13830 1727204092.14509: variable 'omit' from source: magic vars 13830 1727204092.14707: variable 'controller_device' from source: play vars 13830 1727204092.14720: variable 'bond_opt' from source: unknown 13830 1727204092.14746: variable 'omit' from source: magic vars 13830 1727204092.14775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204092.14799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204092.14812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204092.14828: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204092.14835: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 
1727204092.14841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204092.14920: Set connection var ansible_connection to ssh 13830 1727204092.14936: Set connection var ansible_timeout to 10 13830 1727204092.14947: Set connection var ansible_shell_executable to /bin/sh 13830 1727204092.14973: Set connection var ansible_shell_type to sh 13830 1727204092.15004: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204092.15041: Set connection var ansible_pipelining to False 13830 1727204092.15086: variable 'ansible_shell_executable' from source: unknown 13830 1727204092.15094: variable 'ansible_connection' from source: unknown 13830 1727204092.15106: variable 'ansible_module_compression' from source: unknown 13830 1727204092.15115: variable 'ansible_shell_type' from source: unknown 13830 1727204092.15122: variable 'ansible_shell_executable' from source: unknown 13830 1727204092.15130: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204092.15151: variable 'ansible_pipelining' from source: unknown 13830 1727204092.15159: variable 'ansible_timeout' from source: unknown 13830 1727204092.15175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204092.15315: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204092.15336: variable 'omit' from source: magic vars 13830 1727204092.15347: starting attempt loop 13830 1727204092.15353: running the handler 13830 1727204092.15372: _low_level_execute_command(): starting 13830 1727204092.15381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204092.16126: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.16140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.16158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.16178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.16221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.16232: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.16246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.16264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.16278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.16289: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204092.16299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.16311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.16325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.16336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.16346: stderr chunk (state=3): >>>debug2: match 
found <<< 13830 1727204092.16361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.16450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.16484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.16508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.16587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.18106: stdout chunk (state=3): >>>/root <<< 13830 1727204092.18289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.18319: stderr chunk (state=3): >>><<< 13830 1727204092.18322: stdout chunk (state=3): >>><<< 13830 1727204092.18427: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204092.18430: _low_level_execute_command(): starting 13830 1727204092.18437: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885 `" && echo ansible-tmp-1727204092.1833825-15434-250869087857885="` echo /root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885 `" ) && sleep 0' 13830 1727204092.19960: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.20086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.20103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.20121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.20173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.20281: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.20299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.20317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.20328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.20337: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 13830 1727204092.20347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.20357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.20373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.20382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.20392: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204092.20404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.20477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.20613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.20628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.20856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.22710: stdout chunk (state=3): >>>ansible-tmp-1727204092.1833825-15434-250869087857885=/root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885 <<< 13830 1727204092.22933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.22947: stdout chunk (state=3): >>><<< 13830 1727204092.22949: stderr chunk (state=3): >>><<< 13830 1727204092.23172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204092.1833825-15434-250869087857885=/root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204092.23176: variable 'ansible_module_compression' from source: unknown 13830 1727204092.23178: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204092.23180: variable 'ansible_facts' from source: unknown 13830 1727204092.23182: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885/AnsiballZ_command.py 13830 1727204092.23353: Sending initial data 13830 1727204092.23356: Sent initial data (156 bytes) 13830 1727204092.24844: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.24862: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.24882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.24901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.24953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.24967: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.24980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.24994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.25004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.25012: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204092.25023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.25040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.25070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.25082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.25091: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204092.25102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.25181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.25199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.25223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.25311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.27044: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204092.27081: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204092.27126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpy6daassm /root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885/AnsiballZ_command.py <<< 13830 1727204092.27169: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204092.28589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.28772: stderr chunk (state=3): >>><<< 13830 1727204092.28776: stdout chunk (state=3): >>><<< 13830 1727204092.28947: done transferring module to remote 13830 1727204092.28951: _low_level_execute_command(): starting 13830 1727204092.28953: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885/ /root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885/AnsiballZ_command.py && sleep 0' 13830 1727204092.30925: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.30940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.30953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.30971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.31137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.31152: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.31169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.31190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.31203: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.31217: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204092.31233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.31247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.31262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.31277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.31288: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204092.31301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.31384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.31403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.31455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.31673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.33505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.33509: stdout chunk (state=3): >>><<< 13830 1727204092.33511: stderr chunk (state=3): >>><<< 13830 1727204092.33617: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204092.33621: _low_level_execute_command(): starting 13830 1727204092.33623: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885/AnsiballZ_command.py && sleep 0' 13830 1727204092.35157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.35203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.35219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.35236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.35282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.35380: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.35394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.35414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.35425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.35435: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204092.35446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.35458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.35475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.35486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.35495: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204092.35507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.35590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.35683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.35698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.35851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.49627: stdout chunk (state=3): >>> {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-24 14:54:52.492023", "end": "2024-09-24 14:54:52.495263", "delta": "0:00:00.003240", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204092.50995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204092.51088: stderr chunk (state=3): >>><<< 13830 1727204092.51092: stdout chunk (state=3): >>><<< 13830 1727204092.51226: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-24 14:54:52.492023", "end": "2024-09-24 14:54:52.495263", "delta": "0:00:00.003240", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204092.51235: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_select', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204092.51238: _low_level_execute_command(): starting 13830 1727204092.51240: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204092.1833825-15434-250869087857885/ > /dev/null 2>&1 && sleep 0' 13830 1727204092.53557: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.53641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.53656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.53677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.53723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.53851: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.53868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.53886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.53898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.53910: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204092.53923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.53938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.53960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.53977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.53989: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204092.54004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.54092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.54111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.54183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.54383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.56292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.56348: stderr chunk (state=3): >>><<< 13830 1727204092.56352: stdout chunk (state=3): >>><<< 13830 1727204092.56572: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204092.56576: handler run complete 13830 1727204092.56579: Evaluated conditional (False): False 13830 1727204092.56581: variable 'bond_opt' from source: unknown 13830 1727204092.56583: variable 'result' from source: unknown 13830 1727204092.56595: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204092.56612: attempt loop complete, returning result 13830 1727204092.56637: variable 'bond_opt' from source: unknown 13830 1727204092.56717: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'ad_select', 'value': 'stable'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_select", "value": "stable" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_select" ], "delta": "0:00:00.003240", "end": "2024-09-24 14:54:52.495263", "rc": 0, "start": "2024-09-24 14:54:52.492023" } STDOUT: stable 0 13830 1727204092.56942: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204092.56955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204092.56970: variable 'omit' from source: magic vars 13830 1727204092.57138: variable 'ansible_distribution_major_version' from source: facts 13830 1727204092.57279: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204092.57288: variable 'omit' from source: magic vars 13830 1727204092.57323: variable 'omit' from source: magic vars 13830 1727204092.57889: variable 'controller_device' from source: play vars 13830 1727204092.57967: variable 'bond_opt' from source: unknown 13830 1727204092.57994: variable 'omit' from source: magic vars 13830 1727204092.58022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204092.58185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204092.58198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204092.58291: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204092.58299: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204092.58307: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed-node3' 13830 1727204092.58615: Set connection var ansible_connection to ssh 13830 1727204092.58631: Set connection var ansible_timeout to 10 13830 1727204092.58642: Set connection var ansible_shell_executable to /bin/sh 13830 1727204092.58648: Set connection var ansible_shell_type to sh 13830 1727204092.58658: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204092.58730: Set connection var ansible_pipelining to False 13830 1727204092.58758: variable 'ansible_shell_executable' from source: unknown 13830 1727204092.58769: variable 'ansible_connection' from source: unknown 13830 1727204092.58777: variable 'ansible_module_compression' from source: unknown 13830 1727204092.58784: variable 'ansible_shell_type' from source: unknown 13830 1727204092.58830: variable 'ansible_shell_executable' from source: unknown 13830 1727204092.58838: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204092.58852: variable 'ansible_pipelining' from source: unknown 13830 1727204092.58939: variable 'ansible_timeout' from source: unknown 13830 1727204092.58948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204092.59126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204092.59381: variable 'omit' from source: magic vars 13830 1727204092.59391: starting attempt loop 13830 1727204092.59399: running the handler 13830 1727204092.59410: _low_level_execute_command(): starting 13830 1727204092.59418: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204092.61257: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.61262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.61297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204092.61301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.61303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.61490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.61493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.61496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.61561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.63247: stdout chunk (state=3): >>>/root <<< 13830 1727204092.63346: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 13830 1727204092.63433: stderr chunk (state=3): >>><<< 13830 1727204092.63437: stdout chunk (state=3): >>><<< 13830 1727204092.63542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204092.63546: _low_level_execute_command(): starting 13830 1727204092.63549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598 `" && echo ansible-tmp-1727204092.6345475-15434-105226490715598="` echo /root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598 `" ) && sleep 0' 13830 1727204092.65006: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.65010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.65049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204092.65053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.65056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.65058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.65238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.65242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.65244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.65311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.67265: stdout chunk (state=3): 
>>>ansible-tmp-1727204092.6345475-15434-105226490715598=/root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598 <<< 13830 1727204092.67381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.67460: stderr chunk (state=3): >>><<< 13830 1727204092.67463: stdout chunk (state=3): >>><<< 13830 1727204092.67482: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204092.6345475-15434-105226490715598=/root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204092.67502: variable 'ansible_module_compression' from source: unknown 13830 1727204092.67541: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204092.67558: variable 'ansible_facts' from source: unknown 13830 1727204092.67616: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598/AnsiballZ_command.py 13830 1727204092.68709: Sending initial data 13830 1727204092.68713: Sent initial data (156 bytes) 13830 1727204092.71787: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.71792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.71911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.71915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.71987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204092.71993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.72117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 13830 1727204092.72123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.72145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.72256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.73929: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204092.73968: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204092.74010: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpg1jwepdy /root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598/AnsiballZ_command.py <<< 13830 1727204092.74048: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204092.75462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.75570: stderr chunk (state=3): >>><<< 13830 1727204092.75573: stdout chunk (state=3): >>><<< 13830 1727204092.75575: done transferring module to remote 13830 1727204092.75577: _low_level_execute_command(): starting 13830 1727204092.75579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598/ /root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598/AnsiballZ_command.py && sleep 0' 13830 1727204092.77339: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.77383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.77399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.77532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.77579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.77591: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.77605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.77645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.77658: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.77672: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204092.77685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.77698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.77714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.77726: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.77747: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204092.77761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.77920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.77972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.77987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.78181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.79886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204092.79946: stderr chunk (state=3): >>><<< 13830 1727204092.79949: stdout chunk (state=3): >>><<< 13830 1727204092.79969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204092.80055: _low_level_execute_command(): starting 13830 1727204092.80059: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598/AnsiballZ_command.py && sleep 0' 13830 1727204092.81508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204092.81587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.81604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.81624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.81672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.81692: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204092.81708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.81726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204092.81817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204092.81829: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204092.81842: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 13830 1727204092.81855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.81874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.81887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204092.81898: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204092.81916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.81995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.82037: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.82053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.82145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204092.96043: stdout chunk (state=3): >>> {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-24 14:54:52.956190", "end": "2024-09-24 14:54:52.959375", "delta": "0:00:00.003185", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204092.97517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204092.97522: stdout chunk (state=3): >>><<< 13830 1727204092.97526: stderr chunk (state=3): >>><<< 13830 1727204092.97666: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-24 14:54:52.956190", "end": "2024-09-24 14:54:52.959375", "delta": "0:00:00.003185", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204092.97670: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_user_port_key', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204092.97672: _low_level_execute_command(): starting 13830 1727204092.97675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204092.6345475-15434-105226490715598/ > /dev/null 2>&1 && sleep 0' 13830 1727204092.99033: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204092.99037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204092.99191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204092.99195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204092.99197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204092.99257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204092.99387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204092.99390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204092.99451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.01405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.01409: stdout chunk (state=3): >>><<< 13830 1727204093.01414: stderr chunk (state=3): >>><<< 13830 1727204093.01438: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.01446: handler run complete 13830 1727204093.01468: Evaluated conditional (False): False 13830 1727204093.01622: variable 'bond_opt' from source: unknown 13830 1727204093.01627: variable 'result' from source: unknown 13830 1727204093.01646: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204093.01656: attempt loop complete, returning result 13830 1727204093.01678: variable 'bond_opt' from source: unknown 13830 1727204093.01747: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'ad_user_port_key', 'value': '1023'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_user_port_key", "value": "1023" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key" ], "delta": "0:00:00.003185", "end": "2024-09-24 14:54:52.959375", "rc": 0, "start": "2024-09-24 14:54:52.956190" } STDOUT: 1023 13830 1727204093.01887: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.01891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204093.01893: variable 'omit' from source: magic vars 13830 1727204093.02020: variable 'ansible_distribution_major_version' from source: facts 13830 1727204093.02026: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204093.02030: variable 'omit' from source: magic vars 13830 1727204093.02049: variable 'omit' from source: magic vars 13830 1727204093.02439: variable 'controller_device' from source: play vars 13830 1727204093.02442: variable 'bond_opt' from source: unknown 13830 1727204093.02461: variable 'omit' from source: magic vars 13830 1727204093.02482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204093.02489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204093.02496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204093.02508: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204093.02510: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.02515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204093.02705: Set connection var ansible_connection to ssh 13830 1727204093.02714: Set connection var ansible_timeout to 10 13830 1727204093.02719: Set connection var ansible_shell_executable to /bin/sh 13830 1727204093.02722: Set connection var ansible_shell_type to sh 13830 1727204093.02728: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204093.02856: Set connection var 
ansible_pipelining to False 13830 1727204093.02879: variable 'ansible_shell_executable' from source: unknown 13830 1727204093.02882: variable 'ansible_connection' from source: unknown 13830 1727204093.02885: variable 'ansible_module_compression' from source: unknown 13830 1727204093.02887: variable 'ansible_shell_type' from source: unknown 13830 1727204093.02889: variable 'ansible_shell_executable' from source: unknown 13830 1727204093.02892: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.02894: variable 'ansible_pipelining' from source: unknown 13830 1727204093.02896: variable 'ansible_timeout' from source: unknown 13830 1727204093.02901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204093.03114: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204093.03122: variable 'omit' from source: magic vars 13830 1727204093.03125: starting attempt loop 13830 1727204093.03127: running the handler 13830 1727204093.03137: _low_level_execute_command(): starting 13830 1727204093.03140: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204093.05001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.05010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.05023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.05045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.05090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.05148: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.05158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.05173: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.05181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.05188: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.05196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.05205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.05218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.05256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.05262: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.05279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.05349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.05486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.05502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.05576: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13830 1727204093.07254: stdout chunk (state=3): >>>/root <<< 13830 1727204093.07384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.07442: stderr chunk (state=3): >>><<< 13830 1727204093.07446: stdout chunk (state=3): >>><<< 13830 1727204093.07472: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.07559: _low_level_execute_command(): starting 13830 1727204093.07570: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457 `" && echo ansible-tmp-1727204093.0747688-15434-42749574477457="` echo /root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457 `" ) && sleep 0' 13830 1727204093.09151: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.09156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.09190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204093.09194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.09196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.09434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.09538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.09541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.09600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 13830 1727204093.11572: stdout chunk (state=3): >>>ansible-tmp-1727204093.0747688-15434-42749574477457=/root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457 <<< 13830 1727204093.11680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.11776: stderr chunk (state=3): >>><<< 13830 1727204093.11779: stdout chunk (state=3): >>><<< 13830 1727204093.11799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204093.0747688-15434-42749574477457=/root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.11822: variable 'ansible_module_compression' from source: unknown 13830 1727204093.11867: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204093.11889: variable 'ansible_facts' from source: unknown 13830 1727204093.11954: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457/AnsiballZ_command.py 13830 1727204093.12581: Sending initial data 13830 1727204093.12584: Sent initial data (155 bytes) 13830 1727204093.15219: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.15268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.15280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.15303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.15348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.15480: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.15490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.15504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.15511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.15518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.15529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.15540: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.15552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.15559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.15567: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.15580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.15657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.15711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.15724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.15795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.17529: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204093.17562: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204093.17600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmprruma6fe /root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457/AnsiballZ_command.py <<< 13830 1727204093.17639: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204093.18984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.19124: stderr chunk (state=3): >>><<< 13830 1727204093.19128: stdout chunk (state=3): >>><<< 13830 1727204093.19150: done transferring module to remote 13830 1727204093.19158: _low_level_execute_command(): starting 13830 1727204093.19165: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457/ /root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457/AnsiballZ_command.py && sleep 0' 13830 1727204093.21866: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.21884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.21899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.21916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.21970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.21983: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.21997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.22015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 
1727204093.22046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.22057: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.22072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.22085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.22158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.22174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.22185: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.22198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.22393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.22411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.22425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.22593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.24385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.24389: stdout chunk (state=3): >>><<< 13830 1727204093.24392: stderr chunk (state=3): >>><<< 13830 1727204093.24495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.24499: _low_level_execute_command(): starting 13830 1727204093.24501: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457/AnsiballZ_command.py && sleep 0' 13830 1727204093.25958: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.26029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.26047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.26068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.26111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 
1727204093.26245: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.26261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.26284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.26297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.26309: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.26321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.26336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.26351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.26361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.26374: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.26385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.26456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.26581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.26597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.26792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.40576: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-24 14:54:53.401466", "end": "2024-09-24 14:54:53.404681", "delta": "0:00:00.003215", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204093.41999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204093.42003: stdout chunk (state=3): >>><<< 13830 1727204093.42005: stderr chunk (state=3): >>><<< 13830 1727204093.42071: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-24 14:54:53.401466", "end": "2024-09-24 14:54:53.404681", "delta": "0:00:00.003215", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
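The loop items reported so far pin down the expected sysfs values (ad_select=stable, ad_user_port_key=1023, all_slaves_active=1). A sketch of the dictionary that could drive such a loop, assuming the same hypothetical bond_options_to_check variable name as in the sketch above; only the key/value pairs are taken from the logged items.

bond_options_to_check:
  ad_select: stable
  ad_user_port_key: "1023"
  all_slaves_active: "1"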
13830 1727204093.42080: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/all_slaves_active', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204093.42083: _low_level_execute_command(): starting 13830 1727204093.42085: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204093.0747688-15434-42749574477457/ > /dev/null 2>&1 && sleep 0' 13830 1727204093.43746: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.43761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.43780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.43804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.43851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.43907: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.43921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.43941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.43952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.43962: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.43976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.43989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.44006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.44127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.44143: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.44157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.44237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.44255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.44271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.44454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.46283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.46384: stderr chunk (state=3): >>><<< 13830 1727204093.46387: stdout chunk (state=3): >>><<< 13830 1727204093.46672: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.46675: handler run complete 13830 1727204093.46678: Evaluated conditional (False): False 13830 1727204093.46680: variable 'bond_opt' from source: unknown 13830 1727204093.46682: variable 'result' from source: unknown 13830 1727204093.46684: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204093.46685: attempt loop complete, returning result 13830 1727204093.46687: variable 'bond_opt' from source: unknown 13830 1727204093.46719: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'all_slaves_active', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "all_slaves_active", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/all_slaves_active" ], "delta": "0:00:00.003215", "end": "2024-09-24 14:54:53.404681", "rc": 0, "start": "2024-09-24 14:54:53.401466" } STDOUT: 1 13830 1727204093.46954: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.46972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204093.46988: variable 'omit' from source: magic vars 13830 1727204093.47195: variable 'ansible_distribution_major_version' from source: facts 13830 1727204093.47328: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204093.47340: variable 'omit' from source: magic vars 13830 1727204093.47358: variable 'omit' from source: magic vars 13830 1727204093.47643: variable 'controller_device' from source: play vars 13830 1727204093.47767: variable 'bond_opt' from source: unknown 13830 1727204093.47792: variable 'omit' from source: magic vars 13830 1727204093.47816: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204093.47829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204093.47843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204093.47862: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204093.47987: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.47995: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204093.48077: Set connection var ansible_connection to ssh 13830 1727204093.48208: Set connection var ansible_timeout to 10 13830 1727204093.48300: Set connection var ansible_shell_executable to /bin/sh 13830 1727204093.48304: Set connection var ansible_shell_type to sh 13830 1727204093.48306: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204093.48316: Set connection var ansible_pipelining to False 13830 1727204093.48338: variable 'ansible_shell_executable' from source: unknown 13830 1727204093.48341: variable 'ansible_connection' from source: unknown 13830 1727204093.48343: variable 'ansible_module_compression' from source: unknown 13830 1727204093.48345: variable 'ansible_shell_type' from source: unknown 13830 1727204093.48348: variable 'ansible_shell_executable' from source: unknown 13830 1727204093.48350: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.48357: variable 'ansible_pipelining' from source: unknown 13830 1727204093.48359: variable 'ansible_timeout' from source: unknown 13830 1727204093.48361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204093.48571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204093.48580: variable 'omit' from source: magic vars 13830 1727204093.48583: starting attempt loop 13830 1727204093.48585: running the handler 13830 1727204093.48592: _low_level_execute_command(): starting 13830 1727204093.48596: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204093.51049: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.51057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.51069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.51084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.51128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.51136: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.51146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.51160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.51169: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.51176: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.51183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.51192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.51203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.51211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.51218: stderr chunk (state=3): >>>debug2: match found <<< 13830 
1727204093.51227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.51335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.51476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.51489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.51566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.53243: stdout chunk (state=3): >>>/root <<< 13830 1727204093.53415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.53420: stdout chunk (state=3): >>><<< 13830 1727204093.53425: stderr chunk (state=3): >>><<< 13830 1727204093.53445: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.53453: _low_level_execute_command(): starting 13830 1727204093.53459: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553 `" && echo ansible-tmp-1727204093.5344412-15434-208239244513553="` echo /root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553 `" ) && sleep 0' 13830 1727204093.55037: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.55109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.55119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.55135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.55173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.55216: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.55226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.55238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.55245: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.55251: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 13830 1727204093.55258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.55269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.55279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.55286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.55292: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.55300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.55487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.55505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.55516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.55600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.57539: stdout chunk (state=3): >>>ansible-tmp-1727204093.5344412-15434-208239244513553=/root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553 <<< 13830 1727204093.57727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.57733: stdout chunk (state=3): >>><<< 13830 1727204093.57736: stderr chunk (state=3): >>><<< 13830 1727204093.57767: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204093.5344412-15434-208239244513553=/root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.57795: variable 'ansible_module_compression' from source: unknown 13830 1727204093.57836: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204093.57854: variable 'ansible_facts' from source: unknown 13830 1727204093.57920: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553/AnsiballZ_command.py 13830 1727204093.58556: Sending initial data 13830 1727204093.58559: Sent initial data (156 bytes) 13830 1727204093.62461: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.62885: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 13830 1727204093.62896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.62911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.62955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.62962: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.62975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.62988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.62996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.63003: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.63011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.63020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.63034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.63037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.63045: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.63055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.63132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.63590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.63603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.63676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.65393: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204093.65424: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204093.65471: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp2181mxos /root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553/AnsiballZ_command.py <<< 13830 1727204093.65499: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204093.66949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.66955: stderr chunk (state=3): >>><<< 13830 1727204093.66958: stdout chunk (state=3): >>><<< 13830 1727204093.66990: done transferring module to remote 13830 1727204093.66997: _low_level_execute_command(): starting 13830 1727204093.67002: _low_level_execute_command(): executing: /bin/sh 
-c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553/ /root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553/AnsiballZ_command.py && sleep 0' 13830 1727204093.68840: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.68881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.68890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.68904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.68945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.69036: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.69044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.69058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.69067: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.69075: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.69083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.69094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.69104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.69112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.69119: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.69132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.69207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.69288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.69301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.69377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.71151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.71155: stdout chunk (state=3): >>><<< 13830 1727204093.71161: stderr chunk (state=3): >>><<< 13830 1727204093.71185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.71193: _low_level_execute_command(): starting 13830 1727204093.71196: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553/AnsiballZ_command.py && sleep 0' 13830 1727204093.73008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.73016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.73035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.73047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.73089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.73134: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.73151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.73166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.73255: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.73261: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.73271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.73281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.73293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.73300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.73307: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.73316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.73398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.73481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.73493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.73569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.87382: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-24 14:54:53.869621", "end": "2024-09-24 14:54:53.872825", "delta": "0:00:00.003204", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204093.88690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204093.88788: stderr chunk (state=3): >>><<< 13830 1727204093.88791: stdout chunk (state=3): >>><<< 13830 1727204093.88813: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-24 14:54:53.869621", "end": "2024-09-24 14:54:53.872825", "delta": "0:00:00.003204", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
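
The raw module result above is what each per-option check consumes: every loop item cats a single attribute under /sys/class/net/nm-bond/bonding/ and the task condition bond_opt.value in result.stdout is then evaluated against that output. A minimal standalone sketch of the same check as a POSIX shell script, for reference only: the check() helper and the hard-coded option/value pairs are illustrative (taken from the results recorded in this run), while the playbook itself drives them through the bond_opt loop variable shown in the surrounding records.

#!/bin/sh
# Re-enactment of the per-option verification performed by the looped command task:
# read one bonding attribute from sysfs and require the expected value to appear
# in its output, mirroring the 'bond_opt.value in result.stdout' condition.
check() {
    opt=$1
    want=$2
    out=$(cat "/sys/class/net/nm-bond/bonding/$opt")
    case "$out" in
        *"$want"*) printf 'ok: %s -> %s\n' "$opt" "$out" ;;
        *) printf 'FAILED: %s expected %s, got %s\n' "$opt" "$want" "$out" >&2; exit 1 ;;
    esac
}
# option/value pairs observed in this run
check all_slaves_active 1
check downdelay 0
check lacp_rate slow

The substring match mirrors the Jinja "in" test used by the task, which is why the lacp_rate item also passes even though sysfs reports the mode together with its numeric index ("STDOUT: slow 0" further down).
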
13830 1727204093.88843: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/downdelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204093.88848: _low_level_execute_command(): starting 13830 1727204093.88854: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204093.5344412-15434-208239244513553/ > /dev/null 2>&1 && sleep 0' 13830 1727204093.90566: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.90627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.90638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.90667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.90727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.90914: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.90918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.90921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.90923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.90925: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.90927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.90929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.91066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.91076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.91083: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.91093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.91294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.91313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.91325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.91402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.93345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204093.93349: stdout chunk (state=3): >>><<< 13830 1727204093.93355: stderr chunk (state=3): >>><<< 13830 1727204093.93383: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204093.93387: handler run complete 13830 1727204093.93410: Evaluated conditional (False): False 13830 1727204093.93568: variable 'bond_opt' from source: unknown 13830 1727204093.93584: variable 'result' from source: unknown 13830 1727204093.93598: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204093.93611: attempt loop complete, returning result 13830 1727204093.93631: variable 'bond_opt' from source: unknown 13830 1727204093.93712: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'downdelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "downdelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/downdelay" ], "delta": "0:00:00.003204", "end": "2024-09-24 14:54:53.872825", "rc": 0, "start": "2024-09-24 14:54:53.869621" } STDOUT: 0 13830 1727204093.93854: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.93857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204093.93860: variable 'omit' from source: magic vars 13830 1727204093.94023: variable 'ansible_distribution_major_version' from source: facts 13830 1727204093.94027: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204093.94034: variable 'omit' from source: magic vars 13830 1727204093.94047: variable 'omit' from source: magic vars 13830 1727204093.94234: variable 'controller_device' from source: play vars 13830 1727204093.94238: variable 'bond_opt' from source: unknown 13830 1727204093.94255: variable 'omit' from source: magic vars 13830 1727204093.94276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204093.94284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204093.94291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204093.94343: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204093.94346: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.94349: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node3' 13830 1727204093.94498: Set connection var ansible_connection to ssh 13830 1727204093.95428: Set connection var ansible_timeout to 10 13830 1727204093.95440: Set connection var ansible_shell_executable to /bin/sh 13830 1727204093.95443: Set connection var ansible_shell_type to sh 13830 1727204093.95448: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204093.95458: Set connection var ansible_pipelining to False 13830 1727204093.95482: variable 'ansible_shell_executable' from source: unknown 13830 1727204093.95486: variable 'ansible_connection' from source: unknown 13830 1727204093.95488: variable 'ansible_module_compression' from source: unknown 13830 1727204093.95490: variable 'ansible_shell_type' from source: unknown 13830 1727204093.95493: variable 'ansible_shell_executable' from source: unknown 13830 1727204093.95495: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204093.95499: variable 'ansible_pipelining' from source: unknown 13830 1727204093.95502: variable 'ansible_timeout' from source: unknown 13830 1727204093.95506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204093.95720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204093.95729: variable 'omit' from source: magic vars 13830 1727204093.95734: starting attempt loop 13830 1727204093.95737: running the handler 13830 1727204093.95773: _low_level_execute_command(): starting 13830 1727204093.95776: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204093.97694: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204093.97744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.97756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.97772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.97810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.97865: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204093.97875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.97888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204093.97896: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204093.97902: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204093.97910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204093.97958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204093.97975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204093.97983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204093.97990: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204093.97999: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204093.98199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204093.98216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204093.98228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204093.98303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204093.99971: stdout chunk (state=3): >>>/root <<< 13830 1727204094.00151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.00155: stdout chunk (state=3): >>><<< 13830 1727204094.00161: stderr chunk (state=3): >>><<< 13830 1727204094.00190: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.00198: _low_level_execute_command(): starting 13830 1727204094.00204: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059 `" && echo ansible-tmp-1727204094.0018888-15434-57011994833059="` echo /root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059 `" ) && sleep 0' 13830 1727204094.01837: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.01854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.01871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.01890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.01937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.01950: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.01966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.01985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.01996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.02007: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.02021: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.02039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.02055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.02071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.02083: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.02096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.02174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.02290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.02307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.02387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.04271: stdout chunk (state=3): >>>ansible-tmp-1727204094.0018888-15434-57011994833059=/root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059 <<< 13830 1727204094.04469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.04473: stdout chunk (state=3): >>><<< 13830 1727204094.04475: stderr chunk (state=3): >>><<< 13830 1727204094.04674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204094.0018888-15434-57011994833059=/root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.04677: variable 'ansible_module_compression' from source: unknown 13830 1727204094.04680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204094.04682: variable 'ansible_facts' from source: unknown 13830 1727204094.04684: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059/AnsiballZ_command.py 13830 1727204094.05397: Sending initial data 13830 1727204094.05400: Sent initial data (155 bytes) 13830 1727204094.08032: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.08052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.08071: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.08090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.08181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.08193: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.08207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.08226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.08255: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.08268: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.08281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.08293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.08307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.08368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.08384: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.08398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.08595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.08617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.08633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.08705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.10429: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204094.10458: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204094.10499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpbjf26jd8 /root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059/AnsiballZ_command.py <<< 13830 1727204094.10556: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204094.11987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.12072: stderr chunk (state=3): >>><<< 13830 1727204094.12075: stdout chunk (state=3): >>><<< 13830 1727204094.12077: done transferring module to remote 13830 1727204094.12079: _low_level_execute_command(): starting 13830 1727204094.12082: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059/ /root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059/AnsiballZ_command.py && sleep 0' 13830 1727204094.13739: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.13743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.13781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204094.13785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.13787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.13976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.14032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.14199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.15820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.15824: stdout chunk (state=3): >>><<< 13830 1727204094.15830: stderr chunk (state=3): >>><<< 13830 1727204094.15858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.15861: _low_level_execute_command(): starting 13830 1727204094.15867: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059/AnsiballZ_command.py && sleep 0' 13830 1727204094.17520: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.17529: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 13830 1727204094.17543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.17558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.17603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.17616: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.17626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.17643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.17650: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.17657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.17667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.17677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.17689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.17698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.17706: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.17726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.17802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.17824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.17846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.17917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.31713: stdout chunk (state=3): >>> {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-24 14:54:54.312800", "end": "2024-09-24 14:54:54.316141", "delta": "0:00:00.003341", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204094.33163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204094.33175: stdout chunk (state=3): >>><<< 13830 1727204094.33178: stderr chunk (state=3): >>><<< 13830 1727204094.33196: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-24 14:54:54.312800", "end": "2024-09-24 14:54:54.316141", "delta": "0:00:00.003341", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
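
Before the lacp_rate result is summarized below, note that every loop item repeats the same transport sequence, visible in the _low_level_execute_command() records: resolve the remote home directory, create a per-task temporary directory, upload AnsiballZ_command.py with sftp, mark it executable, run it with the remote interpreter, and finally delete the directory. A condensed local sketch of that sequence follows; the payload path and the generated directory name are illustrative placeholders, while the remaining commands are taken from the "executing:" records logged above.

#!/bin/sh
# Condensed re-enactment of the per-item module transport seen in this log
# (the real temporary directory names embed a timestamp and PID, e.g.
# ansible-tmp-1727204094.0018888-15434-57011994833059).
set -e
echo ~                                                   # 1. discover the remote home directory
umask 77
mkdir -p /root/.ansible/tmp                              # 2. base temp dir, then a per-task subdirectory
TMP=$(mktemp -d /root/.ansible/tmp/ansible-tmp-XXXXXX)
cp /path/to/local/AnsiballZ_command.py "$TMP/"           # 3. placeholder for the sftp "put" of the module payload
chmod u+x "$TMP" "$TMP/AnsiballZ_command.py"             # 4. make the directory and payload executable
/usr/bin/python3.9 "$TMP/AnsiballZ_command.py"           # 5. run the module; it prints its JSON result on stdout
rm -rf "$TMP"                                            # 6. clean up (the log uses 'rm -f -r ... > /dev/null 2>&1')

Only the remote command strings change between items; the SSH session itself is reused through the multiplexed master (the auto-mux and mux_client_request_session lines in every stderr dump), so connection setup is paid once per host rather than once per command.
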
13830 1727204094.33228: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lacp_rate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204094.33236: _low_level_execute_command(): starting 13830 1727204094.33241: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204094.0018888-15434-57011994833059/ > /dev/null 2>&1 && sleep 0' 13830 1727204094.34827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.34963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.34979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.34994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.35039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.35048: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.35063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.35182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.35191: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.35198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.35205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.35216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.35227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.35238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.35243: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.35254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.35338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.35391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.35408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.35489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.37384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.37479: stderr chunk (state=3): >>><<< 13830 1727204094.37483: stdout chunk (state=3): >>><<< 13830 1727204094.37504: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.37509: handler run complete 13830 1727204094.37536: Evaluated conditional (False): False 13830 1727204094.37697: variable 'bond_opt' from source: unknown 13830 1727204094.37702: variable 'result' from source: unknown 13830 1727204094.37717: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204094.37728: attempt loop complete, returning result 13830 1727204094.37751: variable 'bond_opt' from source: unknown 13830 1727204094.37823: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'lacp_rate', 'value': 'slow'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lacp_rate", "value": "slow" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lacp_rate" ], "delta": "0:00:00.003341", "end": "2024-09-24 14:54:54.316141", "rc": 0, "start": "2024-09-24 14:54:54.312800" } STDOUT: slow 0 13830 1727204094.37967: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204094.37970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204094.37973: variable 'omit' from source: magic vars 13830 1727204094.38143: variable 'ansible_distribution_major_version' from source: facts 13830 1727204094.38148: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204094.38155: variable 'omit' from source: magic vars 13830 1727204094.38169: variable 'omit' from source: magic vars 13830 1727204094.38642: variable 'controller_device' from source: play vars 13830 1727204094.38648: variable 'bond_opt' from source: unknown 13830 1727204094.38671: variable 'omit' from source: magic vars 13830 1727204094.38695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204094.38704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204094.38711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204094.38727: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204094.38730: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204094.38737: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 13830 1727204094.38831: Set connection var ansible_connection to ssh 13830 1727204094.38844: Set connection var ansible_timeout to 10 13830 1727204094.38849: Set connection var ansible_shell_executable to /bin/sh 13830 1727204094.38858: Set connection var ansible_shell_type to sh 13830 1727204094.38865: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204094.38875: Set connection var ansible_pipelining to False 13830 1727204094.38897: variable 'ansible_shell_executable' from source: unknown 13830 1727204094.38901: variable 'ansible_connection' from source: unknown 13830 1727204094.38903: variable 'ansible_module_compression' from source: unknown 13830 1727204094.38905: variable 'ansible_shell_type' from source: unknown 13830 1727204094.38908: variable 'ansible_shell_executable' from source: unknown 13830 1727204094.38910: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204094.38915: variable 'ansible_pipelining' from source: unknown 13830 1727204094.38917: variable 'ansible_timeout' from source: unknown 13830 1727204094.38922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204094.39024: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204094.39034: variable 'omit' from source: magic vars 13830 1727204094.39039: starting attempt loop 13830 1727204094.39041: running the handler 13830 1727204094.39048: _low_level_execute_command(): starting 13830 1727204094.39051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204094.39788: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.39797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.39808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.39822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.39873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.39877: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.39887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.39900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.39907: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.39914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.39921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.39930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.39949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.39956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.39963: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.39977: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.40053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.40075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.40088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.40168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.41859: stdout chunk (state=3): >>>/root <<< 13830 1727204094.42069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.42073: stdout chunk (state=3): >>><<< 13830 1727204094.42079: stderr chunk (state=3): >>><<< 13830 1727204094.42107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.42116: _low_level_execute_command(): starting 13830 1727204094.42122: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315 `" && echo ansible-tmp-1727204094.4210663-15434-109678166341315="` echo /root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315 `" ) && sleep 0' 13830 1727204094.43071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.43076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.43079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.43081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.43118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.43133: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.43149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.43169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.43182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.43201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.43221: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.43238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.43253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.43268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.43279: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.43291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.43424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.43450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.43468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.43583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.45418: stdout chunk (state=3): >>>ansible-tmp-1727204094.4210663-15434-109678166341315=/root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315 <<< 13830 1727204094.45571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.45609: stderr chunk (state=3): >>><<< 13830 1727204094.45612: stdout chunk (state=3): >>><<< 13830 1727204094.45662: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204094.4210663-15434-109678166341315=/root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.45667: variable 'ansible_module_compression' from source: unknown 13830 1727204094.45685: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204094.45698: variable 'ansible_facts' from source: unknown 13830 1727204094.45744: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315/AnsiballZ_command.py 13830 1727204094.45847: Sending initial data 13830 1727204094.45850: Sent initial data (156 bytes) 13830 1727204094.47511: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 
1727204094.47515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.47570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.47576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.47593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204094.47598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.47691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.47709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.47860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.49475: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204094.49506: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204094.49541: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpx7g9cuql /root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315/AnsiballZ_command.py <<< 13830 1727204094.49573: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204094.50979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.51147: stderr chunk (state=3): >>><<< 13830 1727204094.51151: stdout chunk (state=3): >>><<< 13830 1727204094.51153: done transferring module to remote 13830 1727204094.51155: _low_level_execute_command(): starting 13830 1727204094.51158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315/ /root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315/AnsiballZ_command.py && sleep 0' 13830 1727204094.51980: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.52031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.52050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.52078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 
1727204094.52127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.52181: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.52198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.52252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.52270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.52287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.52301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.52316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.52333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.52354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.52370: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.52389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.52584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.52604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.52624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.52736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.54520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.54524: stdout chunk (state=3): >>><<< 13830 1727204094.54526: stderr chunk (state=3): >>><<< 13830 1727204094.54571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.54575: _low_level_execute_command(): starting 13830 1727204094.54577: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315/AnsiballZ_command.py && sleep 0' 13830 1727204094.55781: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.55816: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.55839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.55862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.55925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.55983: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.55986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.55988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.56816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.56823: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.56832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.56847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.56857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.56866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.56883: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.56898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.57238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.57601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.57617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.57708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.71554: stdout chunk (state=3): >>> {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-24 14:54:54.711337", "end": "2024-09-24 14:54:54.714535", "delta": "0:00:00.003198", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204094.73016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204094.73021: stdout chunk (state=3): >>><<< 13830 1727204094.73026: stderr chunk (state=3): >>><<< 13830 1727204094.73045: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-24 14:54:54.711337", "end": "2024-09-24 14:54:54.714535", "delta": "0:00:00.003198", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
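Each of these checks goes through the same remote lifecycle the log traces above: create a temporary directory under ~/.ansible/tmp, sftp the AnsiballZ_command.py wrapper across, chmod u+x it, run it with /usr/bin/python3.9, then remove the directory, all over the already established multiplexed SSH connection ("auto-mux: Trying existing master"). That round trip is expected here because the run sets ansible_pipelining to False. Below is a sketch of the connection variables involved, expressed as host or group vars; the values mirror what the log reports for managed-node3, except ansible_pipelining, which is flipped to true to show the option that skips the per-task file transfer for most modules.

# Sketch of connection vars (group_vars/host_vars); not taken verbatim from this run's inventory.
ansible_connection: ssh
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: true    # this run uses False; true pipes the module over SSH and avoids the temp-dir/sftp/chmod steps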
13830 1727204094.73082: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204094.73090: _low_level_execute_command(): starting 13830 1727204094.73092: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204094.4210663-15434-109678166341315/ > /dev/null 2>&1 && sleep 0' 13830 1727204094.73734: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.73740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.73753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.73766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.73809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.73821: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.73824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.73839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.73847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.73853: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.73860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.73880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.73885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.73894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.73900: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.73913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.73982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.73999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.74013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.74114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.76017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.76380: stderr chunk (state=3): >>><<< 13830 1727204094.76415: stdout chunk (state=3): >>><<< 13830 1727204094.76461: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.76467: handler run complete 13830 1727204094.76491: Evaluated conditional (False): False 13830 1727204094.76805: variable 'bond_opt' from source: unknown 13830 1727204094.76812: variable 'result' from source: unknown 13830 1727204094.76834: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204094.76843: attempt loop complete, returning result 13830 1727204094.76862: variable 'bond_opt' from source: unknown 13830 1727204094.76946: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'lp_interval', 'value': '128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lp_interval", "value": "128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lp_interval" ], "delta": "0:00:00.003198", "end": "2024-09-24 14:54:54.714535", "rc": 0, "start": "2024-09-24 14:54:54.711337" } STDOUT: 128 13830 1727204094.77287: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204094.77296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204094.77399: variable 'omit' from source: magic vars 13830 1727204094.77974: variable 'ansible_distribution_major_version' from source: facts 13830 1727204094.77978: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204094.77980: variable 'omit' from source: magic vars 13830 1727204094.78002: variable 'omit' from source: magic vars 13830 1727204094.78217: variable 'controller_device' from source: play vars 13830 1727204094.78220: variable 'bond_opt' from source: unknown 13830 1727204094.78245: variable 'omit' from source: magic vars 13830 1727204094.78335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204094.78468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204094.78480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204094.78523: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204094.78526: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204094.78532: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 13830 1727204094.78619: Set connection var ansible_connection to ssh 13830 1727204094.78628: Set connection var ansible_timeout to 10 13830 1727204094.78634: Set connection var ansible_shell_executable to /bin/sh 13830 1727204094.78637: Set connection var ansible_shell_type to sh 13830 1727204094.78648: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204094.78658: Set connection var ansible_pipelining to False 13830 1727204094.78681: variable 'ansible_shell_executable' from source: unknown 13830 1727204094.78684: variable 'ansible_connection' from source: unknown 13830 1727204094.78686: variable 'ansible_module_compression' from source: unknown 13830 1727204094.78689: variable 'ansible_shell_type' from source: unknown 13830 1727204094.78691: variable 'ansible_shell_executable' from source: unknown 13830 1727204094.78693: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204094.78698: variable 'ansible_pipelining' from source: unknown 13830 1727204094.78700: variable 'ansible_timeout' from source: unknown 13830 1727204094.78705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204094.78962: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204094.78972: variable 'omit' from source: magic vars 13830 1727204094.78975: starting attempt loop 13830 1727204094.78986: running the handler 13830 1727204094.79018: _low_level_execute_command(): starting 13830 1727204094.79025: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204094.79917: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.79928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.79944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.79958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.80010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.80018: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.80028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.80043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.80054: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.80060: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.80070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.80082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.80093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.80104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.80110: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.80120: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.80227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.80243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.80257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.80342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.82012: stdout chunk (state=3): >>>/root <<< 13830 1727204094.82203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.82206: stdout chunk (state=3): >>><<< 13830 1727204094.82209: stderr chunk (state=3): >>><<< 13830 1727204094.82308: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.82312: _low_level_execute_command(): starting 13830 1727204094.82314: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891 `" && echo ansible-tmp-1727204094.8222501-15434-275160821116891="` echo /root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891 `" ) && sleep 0' 13830 1727204094.82919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.82933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.82951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.82979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.83021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.83037: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.83056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.83078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.83093: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204094.83103: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204094.83114: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.83132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.83155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.83171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.83186: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.83205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.83288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.83309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.83332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.83415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.85288: stdout chunk (state=3): >>>ansible-tmp-1727204094.8222501-15434-275160821116891=/root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891 <<< 13830 1727204094.85450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.85509: stderr chunk (state=3): >>><<< 13830 1727204094.85528: stdout chunk (state=3): >>><<< 13830 1727204094.85554: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204094.8222501-15434-275160821116891=/root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.85619: variable 'ansible_module_compression' from source: unknown 13830 1727204094.85692: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204094.85710: variable 'ansible_facts' from source: unknown 13830 1727204094.85796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891/AnsiballZ_command.py 13830 1727204094.86059: Sending initial data 13830 1727204094.86063: Sent initial data (156 bytes) 13830 1727204094.87144: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 
1727204094.87151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.87189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.87193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.87203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.87210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.87215: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204094.87224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.87309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.87312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.87323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.87390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.89088: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204094.89138: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204094.89173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp7x5o9rxp /root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891/AnsiballZ_command.py <<< 13830 1727204094.89217: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204094.90125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.90271: stderr chunk (state=3): >>><<< 13830 1727204094.90274: stdout chunk (state=3): >>><<< 13830 1727204094.90276: done transferring module to remote 13830 1727204094.90279: _low_level_execute_command(): starting 13830 1727204094.90281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891/ /root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891/AnsiballZ_command.py && sleep 0' 13830 1727204094.90997: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204094.91001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.91021: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 13830 1727204094.91028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.91079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204094.91083: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204094.91098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.91108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204094.91113: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204094.91120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.91129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.91138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.91242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.91245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.91286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.91331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204094.93057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204094.93116: stderr chunk (state=3): >>><<< 13830 1727204094.93119: stdout chunk (state=3): >>><<< 13830 1727204094.93133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204094.93136: _low_level_execute_command(): starting 13830 1727204094.93138: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891/AnsiballZ_command.py && sleep 0' 13830 1727204094.93676: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 
1727204094.93683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204094.93732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.93740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204094.93742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204094.93788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204094.93794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204094.93817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204094.93893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.07848: stdout chunk (state=3): >>> {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-24 14:54:55.074253", "end": "2024-09-24 14:54:55.077562", "delta": "0:00:00.003309", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204095.09201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204095.09283: stderr chunk (state=3): >>><<< 13830 1727204095.09288: stdout chunk (state=3): >>><<< 13830 1727204095.09308: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-24 14:54:55.074253", "end": "2024-09-24 14:54:55.077562", "delta": "0:00:00.003309", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
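At this point lacp_rate ("slow") and lp_interval ("128") have been read back and matched, and the miimon value just returned above ("110") is about to go through the same bond_opt.value in result.stdout evaluation. Because the task loops with loop_var: bond_opt, registering it would expose one entry per option under .results, each carrying the bond_opt item plus the command's stdout and rc, the same fields printed in the ok: lines. A small follow-up sketch, assuming the loop task had been registered under the hypothetical name bond_opt_result:

# Sketch; bond_opt_result is an assumed register name, and this task would be appended to the same tasks list as the check above.
- name: Summarize the per-option sysfs checks
  ansible.builtin.debug:
    msg: "{{ item.bond_opt.key }} = {{ item.stdout }} (rc={{ item.rc }})"
  loop: "{{ bond_opt_result.results }}"
  loop_control:
    label: "{{ item.bond_opt.key }}"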
13830 1727204095.09341: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/miimon', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204095.09346: _low_level_execute_command(): starting 13830 1727204095.09351: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204094.8222501-15434-275160821116891/ > /dev/null 2>&1 && sleep 0' 13830 1727204095.10000: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.10003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.10036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.10039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.10041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.10100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.10103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.10151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.12483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.12488: stdout chunk (state=3): >>><<< 13830 1727204095.12493: stderr chunk (state=3): >>><<< 13830 1727204095.12496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.12499: handler run complete 13830 1727204095.12501: Evaluated conditional (False): False 13830 1727204095.12503: variable 'bond_opt' from source: unknown 13830 1727204095.12505: variable 'result' from source: unknown 13830 1727204095.12507: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204095.12508: attempt loop complete, returning result 13830 1727204095.12510: variable 'bond_opt' from source: unknown 13830 1727204095.12547: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'miimon', 'value': '110'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "miimon", "value": "110" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/miimon" ], "delta": "0:00:00.003309", "end": "2024-09-24 14:54:55.077562", "rc": 0, "start": "2024-09-24 14:54:55.074253" } STDOUT: 110 13830 1727204095.12696: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.12699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204095.12703: variable 'omit' from source: magic vars 13830 1727204095.12859: variable 'ansible_distribution_major_version' from source: facts 13830 1727204095.12866: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204095.12870: variable 'omit' from source: magic vars 13830 1727204095.12884: variable 'omit' from source: magic vars 13830 1727204095.13051: variable 'controller_device' from source: play vars 13830 1727204095.13061: variable 'bond_opt' from source: unknown 13830 1727204095.13091: variable 'omit' from source: magic vars 13830 1727204095.13116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204095.13130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204095.13141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204095.13159: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204095.13170: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.13178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204095.13262: Set connection var ansible_connection to ssh 13830 1727204095.13282: Set connection var ansible_timeout to 10 13830 1727204095.13295: Set connection var ansible_shell_executable to /bin/sh 13830 1727204095.13304: Set connection var ansible_shell_type to sh 13830 1727204095.13313: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204095.13325: Set connection var ansible_pipelining to False 13830 1727204095.13348: variable 'ansible_shell_executable' from source: unknown 13830 1727204095.13354: variable 'ansible_connection' from source: unknown 13830 1727204095.13360: variable 
'ansible_module_compression' from source: unknown 13830 1727204095.13369: variable 'ansible_shell_type' from source: unknown 13830 1727204095.13375: variable 'ansible_shell_executable' from source: unknown 13830 1727204095.13380: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.13386: variable 'ansible_pipelining' from source: unknown 13830 1727204095.13391: variable 'ansible_timeout' from source: unknown 13830 1727204095.13398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204095.13498: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204095.13514: variable 'omit' from source: magic vars 13830 1727204095.13528: starting attempt loop 13830 1727204095.13536: running the handler 13830 1727204095.13546: _low_level_execute_command(): starting 13830 1727204095.13553: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204095.14146: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.14152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.14192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.14197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.14208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.14213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.14218: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.14226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.14290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.14293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.14303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.14358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.16039: stdout chunk (state=3): >>>/root <<< 13830 1727204095.16143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.16222: stderr chunk (state=3): >>><<< 13830 1727204095.16225: stdout chunk (state=3): >>><<< 13830 1727204095.16253: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.16262: _low_level_execute_command(): starting 13830 1727204095.16271: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703 `" && echo ansible-tmp-1727204095.1625183-15434-128240149159703="` echo /root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703 `" ) && sleep 0' 13830 1727204095.16850: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.16858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.16865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.16888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.16915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.16923: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.16930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.16943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.16950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.16955: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.16966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.16972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.16980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.16983: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.16992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.17042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.17061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.17067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.17129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.19095: stdout chunk (state=3): 
>>>ansible-tmp-1727204095.1625183-15434-128240149159703=/root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703 <<< 13830 1727204095.19212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.19275: stderr chunk (state=3): >>><<< 13830 1727204095.19278: stdout chunk (state=3): >>><<< 13830 1727204095.19292: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204095.1625183-15434-128240149159703=/root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.19310: variable 'ansible_module_compression' from source: unknown 13830 1727204095.19347: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204095.19366: variable 'ansible_facts' from source: unknown 13830 1727204095.19409: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703/AnsiballZ_command.py 13830 1727204095.19511: Sending initial data 13830 1727204095.19514: Sent initial data (156 bytes) 13830 1727204095.20203: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.20210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.20242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.20249: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.20258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.20271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.20274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.20281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.20290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.20295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.20353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.20358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.20422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.22234: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204095.22288: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204095.22341: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmptldw3rfd /root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703/AnsiballZ_command.py <<< 13830 1727204095.22388: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204095.23184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.23293: stderr chunk (state=3): >>><<< 13830 1727204095.23297: stdout chunk (state=3): >>><<< 13830 1727204095.23312: done transferring module to remote 13830 1727204095.23318: _low_level_execute_command(): starting 13830 1727204095.23323: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703/ /root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703/AnsiballZ_command.py && sleep 0' 13830 1727204095.23792: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.23809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.23832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.23844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.23858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.23903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.23927: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.24003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.25726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.25787: stderr chunk (state=3): >>><<< 13830 1727204095.25790: stdout chunk (state=3): >>><<< 13830 1727204095.25804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.25807: _low_level_execute_command(): starting 13830 1727204095.25812: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703/AnsiballZ_command.py && sleep 0' 13830 1727204095.26278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.26297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.26319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204095.26333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.26381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.26393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.26454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.39769: stdout chunk (state=3): >>> {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-24 
14:54:55.393812", "end": "2024-09-24 14:54:55.396768", "delta": "0:00:00.002956", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204095.40983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204095.40987: stderr chunk (state=3): >>><<< 13830 1727204095.40990: stdout chunk (state=3): >>><<< 13830 1727204095.41007: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-24 14:54:55.393812", "end": "2024-09-24 14:54:55.396768", "delta": "0:00:00.002956", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204095.41036: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/num_grat_arp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204095.41041: _low_level_execute_command(): starting 13830 1727204095.41046: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204095.1625183-15434-128240149159703/ > /dev/null 2>&1 && sleep 0' 13830 1727204095.42203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.42219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.42236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.42254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.42316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.42334: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.42346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.42360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.42375: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.42386: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.42401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.42412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.42429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.42442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.42451: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.42462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.42546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.42563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.42579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.42650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.44491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.44495: stdout chunk (state=3): >>><<< 13830 1727204095.44498: stderr chunk (state=3): >>><<< 13830 1727204095.44772: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.44776: handler run complete 13830 1727204095.44778: Evaluated conditional (False): False 13830 1727204095.44780: variable 'bond_opt' from source: unknown 13830 1727204095.44782: variable 'result' from source: unknown 13830 1727204095.44784: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204095.44786: attempt loop complete, returning result 13830 1727204095.44788: variable 'bond_opt' from source: unknown 13830 1727204095.44854: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'num_grat_arp', 'value': '64'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "num_grat_arp", "value": "64" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/num_grat_arp" ], "delta": "0:00:00.002956", "end": "2024-09-24 14:54:55.396768", "rc": 0, "start": "2024-09-24 14:54:55.393812" } STDOUT: 64 13830 1727204095.45088: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.45102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204095.45118: variable 'omit' from source: magic vars 13830 1727204095.45517: variable 'ansible_distribution_major_version' from source: facts 13830 1727204095.45529: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204095.45541: variable 'omit' from source: magic vars 13830 1727204095.45567: variable 'omit' from source: magic vars 13830 1727204095.45739: variable 'controller_device' from source: play vars 13830 1727204095.45749: variable 'bond_opt' from source: unknown 13830 1727204095.45776: variable 'omit' from source: magic vars 13830 1727204095.45803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204095.45816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204095.45828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204095.45850: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204095.45858: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.45869: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 13830 1727204095.45955: Set connection var ansible_connection to ssh 13830 1727204095.45973: Set connection var ansible_timeout to 10 13830 1727204095.45983: Set connection var ansible_shell_executable to /bin/sh 13830 1727204095.45990: Set connection var ansible_shell_type to sh 13830 1727204095.46005: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204095.46021: Set connection var ansible_pipelining to False 13830 1727204095.46048: variable 'ansible_shell_executable' from source: unknown 13830 1727204095.46056: variable 'ansible_connection' from source: unknown 13830 1727204095.46065: variable 'ansible_module_compression' from source: unknown 13830 1727204095.46073: variable 'ansible_shell_type' from source: unknown 13830 1727204095.46081: variable 'ansible_shell_executable' from source: unknown 13830 1727204095.46088: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.46095: variable 'ansible_pipelining' from source: unknown 13830 1727204095.46103: variable 'ansible_timeout' from source: unknown 13830 1727204095.46122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204095.46239: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204095.46253: variable 'omit' from source: magic vars 13830 1727204095.46260: starting attempt loop 13830 1727204095.46267: running the handler 13830 1727204095.46276: _low_level_execute_command(): starting 13830 1727204095.46283: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204095.46960: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.46978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.46998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.47017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.47063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.47083: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.47102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.47125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.47141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.47152: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.47166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.47181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.47198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.47215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.47228: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.47245: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.47327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.47347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.47361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.47557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.49015: stdout chunk (state=3): >>>/root <<< 13830 1727204095.49214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.49218: stdout chunk (state=3): >>><<< 13830 1727204095.49221: stderr chunk (state=3): >>><<< 13830 1727204095.49335: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.49339: _low_level_execute_command(): starting 13830 1727204095.49341: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374 `" && echo ansible-tmp-1727204095.4924214-15434-83367525500374="` echo /root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374 `" ) && sleep 0' 13830 1727204095.50842: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.50916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.50932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.50949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.51033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.51045: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.51057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.51079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.51127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.51140: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.51149: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.51159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.51177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.51187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.51199: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.51210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.51405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.51425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.51447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.51570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.53399: stdout chunk (state=3): >>>ansible-tmp-1727204095.4924214-15434-83367525500374=/root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374 <<< 13830 1727204095.53602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.53605: stdout chunk (state=3): >>><<< 13830 1727204095.53607: stderr chunk (state=3): >>><<< 13830 1727204095.53866: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204095.4924214-15434-83367525500374=/root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.53870: variable 'ansible_module_compression' from source: unknown 13830 1727204095.53872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204095.53874: variable 'ansible_facts' from source: unknown 13830 1727204095.53876: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374/AnsiballZ_command.py 13830 1727204095.54456: Sending initial data 13830 1727204095.54459: Sent initial data (155 bytes) 13830 1727204095.55542: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.55557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 
1727204095.55573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.55596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.55649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.55661: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.55678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.55695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.55709: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.55727: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.55744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.55757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.55776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.55788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.55799: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.55811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.55892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.55910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.55924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.56039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.57702: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204095.57741: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204095.57790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmppffmb5nt /root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374/AnsiballZ_command.py <<< 13830 1727204095.57818: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204095.58990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.59084: stderr chunk (state=3): >>><<< 13830 1727204095.59088: stdout chunk (state=3): >>><<< 13830 1727204095.59112: done transferring module to remote 13830 1727204095.59121: _low_level_execute_command(): starting 13830 1727204095.59124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374/ /root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374/AnsiballZ_command.py && sleep 0' 13830 1727204095.59767: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.59780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.59790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.59803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.59842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.59849: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.59864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.59880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.59889: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.59892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.59899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.59908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.59919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.59926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.59932: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.59944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.60015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.60032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.60047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.60112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.61876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.61880: stdout chunk (state=3): >>><<< 13830 1727204095.61886: stderr chunk (state=3): >>><<< 13830 1727204095.61905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.61908: _low_level_execute_command(): starting 13830 1727204095.61914: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374/AnsiballZ_command.py && sleep 0' 13830 1727204095.63244: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.63692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.63702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.63717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.63760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.63769: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.63778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.63792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.63800: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.63806: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.63814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.63823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.63837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.63844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.63850: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.63859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.63937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.63952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.63955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.64282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.77496: stdout chunk (state=3): >>> {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-24 14:54:55.770571", "end": "2024-09-24 14:54:55.774051", "delta": "0:00:00.003480", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204095.78895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204095.78901: stdout chunk (state=3): >>><<< 13830 1727204095.78904: stderr chunk (state=3): >>><<< 13830 1727204095.78927: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-24 14:54:55.770571", "end": "2024-09-24 14:54:55.774051", "delta": "0:00:00.003480", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204095.78955: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/resend_igmp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204095.78958: _low_level_execute_command(): starting 13830 1727204095.78965: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204095.4924214-15434-83367525500374/ > /dev/null 2>&1 && sleep 0' 13830 1727204095.79587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.79597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.79606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.79620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.79654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.79660: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.79672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.79683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.79690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.79698: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.79707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.79718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.79726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.79734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.79738: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.79747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.79813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.79833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.79842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.79914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.81860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.81873: stdout chunk (state=3): >>><<< 13830 1727204095.81877: stderr chunk (state=3): >>><<< 13830 1727204095.81894: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.81901: handler run complete 13830 1727204095.81923: Evaluated conditional (False): False 13830 1727204095.82104: variable 'bond_opt' from source: unknown 13830 1727204095.82112: variable 'result' from source: unknown 13830 1727204095.82134: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204095.82149: attempt loop complete, returning result 13830 1727204095.82193: variable 'bond_opt' from source: unknown 13830 1727204095.82349: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'resend_igmp', 'value': '225'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "resend_igmp", "value": "225" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/resend_igmp" ], "delta": "0:00:00.003480", "end": "2024-09-24 14:54:55.774051", "rc": 0, "start": "2024-09-24 14:54:55.770571" } STDOUT: 225 13830 1727204095.82489: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.82492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204095.82496: variable 'omit' from source: magic vars 13830 1727204095.82733: variable 'ansible_distribution_major_version' from source: facts 13830 1727204095.82741: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204095.82744: variable 'omit' from source: magic vars 13830 1727204095.82773: variable 'omit' from source: magic vars 13830 1727204095.83624: variable 'controller_device' from source: play vars 13830 1727204095.83637: variable 'bond_opt' from source: unknown 13830 1727204095.83661: variable 'omit' from source: magic vars 13830 1727204095.83688: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204095.83702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204095.83712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204095.83732: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204095.83740: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.83748: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node3' 13830 1727204095.83834: Set connection var ansible_connection to ssh 13830 1727204095.83853: Set connection var ansible_timeout to 10 13830 1727204095.83871: Set connection var ansible_shell_executable to /bin/sh 13830 1727204095.83880: Set connection var ansible_shell_type to sh 13830 1727204095.83897: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204095.83915: Set connection var ansible_pipelining to False 13830 1727204095.83945: variable 'ansible_shell_executable' from source: unknown 13830 1727204095.83957: variable 'ansible_connection' from source: unknown 13830 1727204095.83969: variable 'ansible_module_compression' from source: unknown 13830 1727204095.83977: variable 'ansible_shell_type' from source: unknown 13830 1727204095.83988: variable 'ansible_shell_executable' from source: unknown 13830 1727204095.84001: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204095.84017: variable 'ansible_pipelining' from source: unknown 13830 1727204095.84025: variable 'ansible_timeout' from source: unknown 13830 1727204095.84036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204095.84144: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204095.84165: variable 'omit' from source: magic vars 13830 1727204095.84177: starting attempt loop 13830 1727204095.84186: running the handler 13830 1727204095.84202: _low_level_execute_command(): starting 13830 1727204095.84218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204095.84969: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.84985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.85000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.85018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.85063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.85079: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.85093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.85112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.85125: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.85140: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.85152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.85167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.85182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.85193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.85203: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.85225: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.85318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.85340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.85361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.85487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.87100: stdout chunk (state=3): >>>/root <<< 13830 1727204095.87212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.87299: stderr chunk (state=3): >>><<< 13830 1727204095.87310: stdout chunk (state=3): >>><<< 13830 1727204095.87370: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.87375: _low_level_execute_command(): starting 13830 1727204095.87380: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580 `" && echo ansible-tmp-1727204095.873362-15434-206853474504580="` echo /root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580 `" ) && sleep 0' 13830 1727204095.88011: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.88026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.88049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.88071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.88114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.88127: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.88148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.88169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.88182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.88195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.88208: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.88222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.88242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.88260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.88275: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.88290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.88369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.88396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.88413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.88498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.90507: stdout chunk (state=3): >>>ansible-tmp-1727204095.873362-15434-206853474504580=/root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580 <<< 13830 1727204095.90723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.90726: stdout chunk (state=3): >>><<< 13830 1727204095.90729: stderr chunk (state=3): >>><<< 13830 1727204095.90994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204095.873362-15434-206853474504580=/root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.90998: variable 'ansible_module_compression' from source: unknown 13830 1727204095.91000: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204095.91002: variable 'ansible_facts' from source: unknown 13830 1727204095.91004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580/AnsiballZ_command.py 13830 1727204095.91068: Sending initial data 13830 1727204095.91071: Sent initial data (155 bytes) 13830 1727204095.92256: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.92279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 
1727204095.92301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.92320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.92380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.92399: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.92426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.92450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.92472: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.92484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.92496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.92513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.92539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.92559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.92581: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.92603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.92688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.92713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.92728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.92812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.94671: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204095.94725: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204095.94774: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpymsekit8 /root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580/AnsiballZ_command.py <<< 13830 1727204095.94812: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204095.95901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.96070: stderr chunk (state=3): >>><<< 13830 1727204095.96073: stdout chunk (state=3): >>><<< 13830 1727204095.96176: done transferring module to remote 13830 1727204095.96180: _low_level_execute_command(): starting 13830 1727204095.96182: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580/ /root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580/AnsiballZ_command.py && sleep 0' 13830 1727204095.96889: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204095.96913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.96936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.96973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.97015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.97027: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204095.97043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.97070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204095.97084: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204095.97094: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204095.97107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204095.97120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204095.97140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204095.97166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204095.97183: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204095.97198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204095.97312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204095.97351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204095.97397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204095.97521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204095.99377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204095.99468: stderr chunk (state=3): >>><<< 13830 1727204095.99472: stdout chunk (state=3): >>><<< 13830 1727204095.99496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204095.99499: _low_level_execute_command(): starting 13830 1727204095.99501: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580/AnsiballZ_command.py && sleep 0' 13830 1727204096.00190: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.00193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.00201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.00250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.00279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.00282: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204096.00290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.00303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204096.00310: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204096.00317: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204096.00325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.00398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.00402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.00404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.00406: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204096.00408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.00476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.00506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.00510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.00628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.13854: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-24 14:54:56.134703", "end": "2024-09-24 14:54:56.137633", "delta": "0:00:00.002930", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204096.14977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204096.15039: stderr chunk (state=3): >>><<< 13830 1727204096.15048: stdout chunk (state=3): >>><<< 13830 1727204096.15069: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-24 14:54:56.134703", "end": "2024-09-24 14:54:56.137633", "delta": "0:00:00.002930", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
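[Editor's note] The records above trace one full remote execution cycle for a single loop item: echo ~, creation of a per-task tmp directory, sftp transfer of AnsiballZ_command.py, chmod, remote Python execution, and rm -f -r cleanup. This is the non-pipelined path; the log earlier sets the connection var ansible_pipelining to False. A hedged sketch of the standard variable that avoids most of these round-trips follows; it is not something this run uses, only a general Ansible option.

# Hedged sketch, not part of this run: placing this in inventory or
# group_vars/all.yml makes Ansible feed the module over the existing SSH
# channel instead of the mkdir / sftp put / chmod / rm cycle seen above.
ansible_pipelining: true
# Commonly noted caveat: combined with become/sudo, pipelining requires
# "requiretty" to be disabled in sudoers on the managed host.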
13830 1727204096.15096: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/updelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204096.15100: _low_level_execute_command(): starting 13830 1727204096.15105: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204095.873362-15434-206853474504580/ > /dev/null 2>&1 && sleep 0' 13830 1727204096.15642: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.15645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.15690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204096.15695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.15698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.15700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.15762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.15769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.15772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.15812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.17555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.17636: stderr chunk (state=3): >>><<< 13830 1727204096.17639: stdout chunk (state=3): >>><<< 13830 1727204096.17653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.17659: handler run complete 13830 1727204096.17679: Evaluated conditional (False): False 13830 1727204096.17787: variable 'bond_opt' from source: unknown 13830 1727204096.17793: variable 'result' from source: unknown 13830 1727204096.17803: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204096.17813: attempt loop complete, returning result 13830 1727204096.17826: variable 'bond_opt' from source: unknown 13830 1727204096.17883: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'updelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "updelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/updelay" ], "delta": "0:00:00.002930", "end": "2024-09-24 14:54:56.137633", "rc": 0, "start": "2024-09-24 14:54:56.134703" } STDOUT: 0 13830 1727204096.18019: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204096.18022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204096.18024: variable 'omit' from source: magic vars 13830 1727204096.18116: variable 'ansible_distribution_major_version' from source: facts 13830 1727204096.18119: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204096.18124: variable 'omit' from source: magic vars 13830 1727204096.18139: variable 'omit' from source: magic vars 13830 1727204096.18249: variable 'controller_device' from source: play vars 13830 1727204096.18261: variable 'bond_opt' from source: unknown 13830 1727204096.18282: variable 'omit' from source: magic vars 13830 1727204096.18308: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204096.18315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204096.18321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204096.18333: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204096.18336: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204096.18338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204096.18404: Set connection var ansible_connection to ssh 13830 1727204096.18414: Set connection var ansible_timeout to 10 13830 1727204096.18436: Set connection var ansible_shell_executable to /bin/sh 13830 1727204096.18439: Set connection var ansible_shell_type to sh 13830 1727204096.18442: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204096.18453: Set connection var ansible_pipelining to False 13830 1727204096.18481: variable 'ansible_shell_executable' from source: unknown 13830 
1727204096.18484: variable 'ansible_connection' from source: unknown 13830 1727204096.18487: variable 'ansible_module_compression' from source: unknown 13830 1727204096.18489: variable 'ansible_shell_type' from source: unknown 13830 1727204096.18491: variable 'ansible_shell_executable' from source: unknown 13830 1727204096.18494: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204096.18498: variable 'ansible_pipelining' from source: unknown 13830 1727204096.18504: variable 'ansible_timeout' from source: unknown 13830 1727204096.18506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204096.18634: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204096.18642: variable 'omit' from source: magic vars 13830 1727204096.18645: starting attempt loop 13830 1727204096.18650: running the handler 13830 1727204096.18652: _low_level_execute_command(): starting 13830 1727204096.18654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204096.19307: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.19319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.19356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204096.19376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.19428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.19477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.19492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.19534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.21058: stdout chunk (state=3): >>>/root <<< 13830 1727204096.21161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.21232: stderr chunk (state=3): >>><<< 13830 1727204096.21236: stdout chunk (state=3): >>><<< 13830 1727204096.21252: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.21259: _low_level_execute_command(): starting 13830 1727204096.21265: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708 `" && echo ansible-tmp-1727204096.2125134-15434-121238467734708="` echo /root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708 `" ) && sleep 0' 13830 1727204096.21724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.21730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.21765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.21772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204096.21777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.21786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.21792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.21799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.21804: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204096.21809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.21874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.21886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.21934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.23732: stdout chunk (state=3): >>>ansible-tmp-1727204096.2125134-15434-121238467734708=/root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708 <<< 13830 1727204096.23851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.23914: stderr chunk (state=3): >>><<< 13830 1727204096.23919: stdout chunk (state=3): >>><<< 13830 1727204096.23930: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204096.2125134-15434-121238467734708=/root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.23953: variable 'ansible_module_compression' from source: unknown 13830 1727204096.23985: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204096.23999: variable 'ansible_facts' from source: unknown 13830 1727204096.24051: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708/AnsiballZ_command.py 13830 1727204096.24149: Sending initial data 13830 1727204096.24154: Sent initial data (156 bytes) 13830 1727204096.27709: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.27808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.27829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.27954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.28306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.28386: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204096.28531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.28711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204096.28870: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204096.29074: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204096.29192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.29289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.29306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.29314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.29365: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.29369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.29423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.31263: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204096.31302: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204096.31345: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmph2t4v_yw /root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708/AnsiballZ_command.py <<< 13830 1727204096.31380: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204096.32869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.33070: stderr chunk (state=3): >>><<< 13830 1727204096.33073: stdout chunk (state=3): >>><<< 13830 1727204096.33075: done transferring module to remote 13830 1727204096.33077: _low_level_execute_command(): starting 13830 1727204096.33154: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708/ /root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708/AnsiballZ_command.py && sleep 0' 13830 1727204096.34206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.34220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.34237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.34256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.34317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.34333: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204096.34347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.34366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204096.34378: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204096.34392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204096.34411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.34424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.34442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.34453: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.34467: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204096.34482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.34570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.34590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.34606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.34693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.36520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.36611: stderr chunk (state=3): >>><<< 13830 1727204096.36615: stdout chunk (state=3): >>><<< 13830 1727204096.36705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.36709: _low_level_execute_command(): starting 13830 1727204096.36712: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708/AnsiballZ_command.py && sleep 0' 13830 1727204096.38159: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.38235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.38251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.38273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.38318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.38460: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204096.38481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.38498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204096.38509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204096.38519: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204096.38533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13830 1727204096.38546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.38569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.38581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.38592: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204096.38606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.38695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.38713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.38787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.39002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.52869: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-24 14:54:56.524469", "end": "2024-09-24 14:54:56.527657", "delta": "0:00:00.003188", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204096.54139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204096.54143: stdout chunk (state=3): >>><<< 13830 1727204096.54145: stderr chunk (state=3): >>><<< 13830 1727204096.54295: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-24 14:54:56.524469", "end": "2024-09-24 14:54:56.527657", "delta": "0:00:00.003188", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204096.54299: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/use_carrier', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204096.54302: _low_level_execute_command(): starting 13830 1727204096.54304: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204096.2125134-15434-121238467734708/ > /dev/null 2>&1 && sleep 0' 13830 1727204096.54926: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.54954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.54973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.54993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.55036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.55051: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204096.55077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.55096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204096.55108: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204096.55119: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204096.55130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.55144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.55161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.55184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.55196: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204096.55210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.55295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.55318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.55339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.55421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.57217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.57326: stderr chunk (state=3): >>><<< 13830 1727204096.57337: stdout chunk (state=3): >>><<< 13830 
1727204096.57374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.57381: handler run complete 13830 1727204096.57479: Evaluated conditional (False): False 13830 1727204096.57592: variable 'bond_opt' from source: unknown 13830 1727204096.57669: variable 'result' from source: unknown 13830 1727204096.57673: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204096.57675: attempt loop complete, returning result 13830 1727204096.57677: variable 'bond_opt' from source: unknown 13830 1727204096.57740: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'use_carrier', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "use_carrier", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/use_carrier" ], "delta": "0:00:00.003188", "end": "2024-09-24 14:54:56.527657", "rc": 0, "start": "2024-09-24 14:54:56.524469" } STDOUT: 1 13830 1727204096.57970: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204096.57983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204096.57996: variable 'omit' from source: magic vars 13830 1727204096.58185: variable 'ansible_distribution_major_version' from source: facts 13830 1727204096.58197: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204096.58206: variable 'omit' from source: magic vars 13830 1727204096.58238: variable 'omit' from source: magic vars 13830 1727204096.58416: variable 'controller_device' from source: play vars 13830 1727204096.58425: variable 'bond_opt' from source: unknown 13830 1727204096.58459: variable 'omit' from source: magic vars 13830 1727204096.58487: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204096.58499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204096.58508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204096.58523: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204096.58529: variable 'ansible_host' from source: host vars for 'managed-node3' 
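[editor note] The entries above cover one item of the "** TEST check bond settings" loop: the module runs `cat /sys/class/net/nm-bond/bonding/use_carrier`, and the task passes because the registered result is checked with the conditional `bond_opt.value in result.stdout`, i.e. a plain substring test against the command output. As a minimal illustrative sketch only (the device and option names are taken from the log; the helper itself is not part of the test suite), the same check outside Ansible could look like:

    from pathlib import Path

    def check_bond_opt(device: str, key: str, expected: str) -> bool:
        """Read /sys/class/net/<device>/bonding/<key> and mirror the task's
        substring conditional: the expected value only has to appear somewhere
        in the sysfs output."""
        sysfs = Path("/sys/class/net") / device / "bonding" / key
        stdout = sysfs.read_text().strip()
        return expected in stdout

    if __name__ == "__main__":
        # use_carrier reads back "1" in the log above.
        for key, value in [("use_carrier", "1")]:
            ok = check_bond_opt("nm-bond", key, value)
            print(f"{key}: expected {value!r} -> {'ok' if ok else 'FAILED'}")

The substring semantics matter for later items, where the sysfs file contains more than the bare value.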
13830 1727204096.58536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204096.58624: Set connection var ansible_connection to ssh 13830 1727204096.58639: Set connection var ansible_timeout to 10 13830 1727204096.58669: Set connection var ansible_shell_executable to /bin/sh 13830 1727204096.58680: Set connection var ansible_shell_type to sh 13830 1727204096.58690: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204096.58702: Set connection var ansible_pipelining to False 13830 1727204096.58724: variable 'ansible_shell_executable' from source: unknown 13830 1727204096.58731: variable 'ansible_connection' from source: unknown 13830 1727204096.58737: variable 'ansible_module_compression' from source: unknown 13830 1727204096.58743: variable 'ansible_shell_type' from source: unknown 13830 1727204096.58749: variable 'ansible_shell_executable' from source: unknown 13830 1727204096.58755: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204096.58761: variable 'ansible_pipelining' from source: unknown 13830 1727204096.58771: variable 'ansible_timeout' from source: unknown 13830 1727204096.58788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204096.58888: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204096.58924: variable 'omit' from source: magic vars 13830 1727204096.58931: starting attempt loop 13830 1727204096.58937: running the handler 13830 1727204096.58946: _low_level_execute_command(): starting 13830 1727204096.58953: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204096.59844: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.59858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.59878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.59904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.59945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.59958: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204096.59976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.60006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204096.60021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204096.60032: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204096.60044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.60057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.60075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.60087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.60101: stderr chunk (state=3): >>>debug2: 
match found <<< 13830 1727204096.60125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.60207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.60239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.60256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.60331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.61849: stdout chunk (state=3): >>>/root <<< 13830 1727204096.61977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.62038: stderr chunk (state=3): >>><<< 13830 1727204096.62049: stdout chunk (state=3): >>><<< 13830 1727204096.62174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.62177: _low_level_execute_command(): starting 13830 1727204096.62180: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775 `" && echo ansible-tmp-1727204096.620755-15434-218521520355775="` echo /root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775 `" ) && sleep 0' 13830 1727204096.62966: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.62975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.62986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.63000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.63040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.63054: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204096.63057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.63072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204096.63079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204096.63085: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 13830 1727204096.63093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.63101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.63111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.63118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204096.63124: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204096.63134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.63202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.63216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.63226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.63301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.65093: stdout chunk (state=3): >>>ansible-tmp-1727204096.620755-15434-218521520355775=/root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775 <<< 13830 1727204096.65299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.65302: stdout chunk (state=3): >>><<< 13830 1727204096.65305: stderr chunk (state=3): >>><<< 13830 1727204096.65476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204096.620755-15434-218521520355775=/root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.65479: variable 'ansible_module_compression' from source: unknown 13830 1727204096.65481: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204096.65483: variable 'ansible_facts' from source: unknown 13830 1727204096.65568: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775/AnsiballZ_command.py 13830 1727204096.65658: Sending initial data 13830 1727204096.65661: Sent initial data (155 bytes) 13830 1727204096.66657: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.66694: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 13830 1727204096.66698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.66727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.66730: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.66732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204096.66734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.66780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.66784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.66791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.66845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.68534: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204096.68567: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204096.68607: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmplisxqhrq /root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775/AnsiballZ_command.py <<< 13830 1727204096.68644: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204096.69513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.69797: stderr chunk (state=3): >>><<< 13830 1727204096.69800: stdout chunk (state=3): >>><<< 13830 1727204096.69802: done transferring module to remote 13830 1727204096.69805: _low_level_execute_command(): starting 13830 1727204096.69807: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775/ /root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775/AnsiballZ_command.py && sleep 0' 13830 1727204096.70477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.70497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.70540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.70544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.70603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.70609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.70715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.72535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.72590: stderr chunk (state=3): >>><<< 13830 1727204096.72593: stdout chunk (state=3): >>><<< 13830 1727204096.72607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.72610: _low_level_execute_command(): starting 13830 1727204096.72615: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775/AnsiballZ_command.py && sleep 0' 13830 1727204096.73086: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.73090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.73117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.73122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.73170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.73182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.73242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.87352: stdout chunk (state=3): >>> {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-24 14:54:56.869430", "end": "2024-09-24 14:54:56.872628", "delta": "0:00:00.003198", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204096.88745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204096.88824: stderr chunk (state=3): >>><<< 13830 1727204096.88827: stdout chunk (state=3): >>><<< 13830 1727204096.88865: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-24 14:54:56.869430", "end": "2024-09-24 14:54:56.872628", "delta": "0:00:00.003198", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
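[editor note] Each `_low_level_execute_command()` round trip above rides on an existing OpenSSH control master ("auto-mux: Trying existing master"), which keeps the per-item overhead small: make a remote temp dir, sftp the AnsiballZ_command.py payload, chmod it, run it with the remote /usr/bin/python3.9, then remove the temp dir. Purely as a rough illustration of that multiplexed command execution (this is not Ansible's ssh connection plugin; the host and control path below are placeholders, and the quoting is deliberately naive):

    import subprocess

    SSH_OPTS = [
        "-o", "ControlMaster=auto",              # reuse one SSH session per host
        "-o", "ControlPersist=60s",              # keep the master alive between commands
        "-o", "ControlPath=/tmp/ssh-mux-%h-%p-%r",  # placeholder control socket path
    ]

    def run_remote(host: str, command: str) -> subprocess.CompletedProcess:
        """Run one shell command over a (possibly multiplexed) SSH connection,
        shaped like the `/bin/sh -c '...'` invocations seen in the log.
        Naive single-quote wrapping: fine for the simple probes shown here."""
        return subprocess.run(
            ["ssh", *SSH_OPTS, host, f"/bin/sh -c '{command}'"],
            capture_output=True, text=True, check=False,
        )

    if __name__ == "__main__":
        # Same home-directory probe the log shows before every module run.
        result = run_remote("root@10.31.15.87", "echo ~ && sleep 0")
        print(result.returncode, result.stdout.strip())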
13830 1727204096.88893: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/xmit_hash_policy', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204096.88897: _low_level_execute_command(): starting 13830 1727204096.88905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204096.620755-15434-218521520355775/ > /dev/null 2>&1 && sleep 0' 13830 1727204096.90293: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204096.90305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204096.90308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204096.90345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204096.90378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204096.90385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204096.90455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204096.90474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204096.90488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204096.90551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204096.92500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204096.92837: stderr chunk (state=3): >>><<< 13830 1727204096.92840: stdout chunk (state=3): >>><<< 13830 1727204096.92860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204096.92863: handler run complete 13830 1727204096.92889: Evaluated conditional (False): False 13830 1727204096.93063: variable 'bond_opt' from source: unknown 13830 1727204096.93076: variable 'result' from source: unknown 13830 1727204096.93084: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204096.93107: attempt loop complete, returning result 13830 1727204096.93122: variable 'bond_opt' from source: unknown 13830 1727204096.93228: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'xmit_hash_policy', 'value': 'encap2+3'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "xmit_hash_policy", "value": "encap2+3" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy" ], "delta": "0:00:00.003198", "end": "2024-09-24 14:54:56.872628", "rc": 0, "start": "2024-09-24 14:54:56.869430" } STDOUT: encap2+3 3 13830 1727204096.93398: dumping result to json 13830 1727204096.93401: done dumping result, returning 13830 1727204096.93416: done running TaskExecutor() for managed-node3/TASK: ** TEST check bond settings [0affcd87-79f5-1659-6b02-000000000400] 13830 1727204096.93420: sending task result for task 0affcd87-79f5-1659-6b02-000000000400 13830 1727204096.94274: no more pending results, returning what we have 13830 1727204096.94279: results queue empty 13830 1727204096.94280: checking for any_errors_fatal 13830 1727204096.94285: done checking for any_errors_fatal 13830 1727204096.94286: checking for max_fail_percentage 13830 1727204096.94288: done checking for max_fail_percentage 13830 1727204096.94297: checking to see if all hosts have failed and the running result is not ok 13830 1727204096.94298: done checking to see if all hosts have failed 13830 1727204096.94298: getting the remaining hosts for this loop 13830 1727204096.94300: done getting the remaining hosts for this loop 13830 1727204096.94304: getting the next task for host managed-node3 13830 1727204096.94309: done getting next task for host managed-node3 13830 1727204096.94312: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 13830 1727204096.94315: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13830 1727204096.94319: getting variables 13830 1727204096.94320: in VariableManager get_vars() 13830 1727204096.94344: Calling all_inventory to load vars for managed-node3 13830 1727204096.94347: Calling groups_inventory to load vars for managed-node3 13830 1727204096.94350: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204096.94356: done sending task result for task 0affcd87-79f5-1659-6b02-000000000400 13830 1727204096.94359: WORKER PROCESS EXITING 13830 1727204096.94370: Calling all_plugins_play to load vars for managed-node3 13830 1727204096.94373: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204096.94376: Calling groups_plugins_play to load vars for managed-node3 13830 1727204096.95619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204096.97113: done with get_vars() 13830 1727204096.97142: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Tuesday 24 September 2024 14:54:56 -0400 (0:00:05.888) 0:00:30.050 ***** 13830 1727204096.97241: entering _queue_task() for managed-node3/include_tasks 13830 1727204096.97591: worker is 1 (out of 1 available) 13830 1727204096.97603: exiting _queue_task() for managed-node3/include_tasks 13830 1727204096.97617: done queuing things up, now waiting for results queue to drain 13830 1727204096.97619: waiting for pending results... 13830 1727204096.97951: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_IPv4_present.yml' 13830 1727204096.98315: in run() - task 0affcd87-79f5-1659-6b02-000000000402 13830 1727204096.98343: variable 'ansible_search_path' from source: unknown 13830 1727204096.98350: variable 'ansible_search_path' from source: unknown 13830 1727204096.98425: calling self._execute() 13830 1727204096.98745: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204096.98776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204096.98792: variable 'omit' from source: magic vars 13830 1727204096.99326: variable 'ansible_distribution_major_version' from source: facts 13830 1727204096.99339: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204096.99344: _execute() done 13830 1727204096.99348: dumping result to json 13830 1727204096.99352: done dumping result, returning 13830 1727204096.99358: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_IPv4_present.yml' [0affcd87-79f5-1659-6b02-000000000402] 13830 1727204096.99366: sending task result for task 0affcd87-79f5-1659-6b02-000000000402 13830 1727204096.99516: done sending task result for task 0affcd87-79f5-1659-6b02-000000000402 13830 1727204096.99519: WORKER PROCESS EXITING 13830 1727204096.99550: no more pending results, returning what we have 13830 1727204096.99556: in VariableManager get_vars() 13830 1727204096.99612: Calling all_inventory to load vars for managed-node3 13830 1727204096.99616: Calling groups_inventory to load vars for managed-node3 13830 1727204096.99623: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204096.99653: Calling all_plugins_play to load vars for managed-node3 13830 1727204096.99657: Calling groups_plugins_inventory to 
load vars for managed-node3 13830 1727204096.99660: Calling groups_plugins_play to load vars for managed-node3 13830 1727204097.00871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204097.02919: done with get_vars() 13830 1727204097.02942: variable 'ansible_search_path' from source: unknown 13830 1727204097.02943: variable 'ansible_search_path' from source: unknown 13830 1727204097.02952: variable 'item' from source: include params 13830 1727204097.03037: variable 'item' from source: include params 13830 1727204097.03095: we have included files to process 13830 1727204097.03099: generating all_blocks data 13830 1727204097.03102: done generating all_blocks data 13830 1727204097.03110: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 13830 1727204097.03114: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 13830 1727204097.03119: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 13830 1727204097.03551: done processing included file 13830 1727204097.03555: iterating over new_blocks loaded from include file 13830 1727204097.03558: in VariableManager get_vars() 13830 1727204097.03591: done with get_vars() 13830 1727204097.03595: filtering new block on tags 13830 1727204097.03647: done filtering new block on tags 13830 1727204097.03652: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed-node3 13830 1727204097.03660: extending task lists for all hosts with included blocks 13830 1727204097.04056: done extending task lists 13830 1727204097.04059: done processing included files 13830 1727204097.04060: results queue empty 13830 1727204097.04061: checking for any_errors_fatal 13830 1727204097.04089: done checking for any_errors_fatal 13830 1727204097.04090: checking for max_fail_percentage 13830 1727204097.04093: done checking for max_fail_percentage 13830 1727204097.04094: checking to see if all hosts have failed and the running result is not ok 13830 1727204097.04095: done checking to see if all hosts have failed 13830 1727204097.04095: getting the remaining hosts for this loop 13830 1727204097.04097: done getting the remaining hosts for this loop 13830 1727204097.04102: getting the next task for host managed-node3 13830 1727204097.04109: done getting next task for host managed-node3 13830 1727204097.04113: ^ task is: TASK: ** TEST check IPv4 13830 1727204097.04117: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204097.04122: getting variables 13830 1727204097.04123: in VariableManager get_vars() 13830 1727204097.04139: Calling all_inventory to load vars for managed-node3 13830 1727204097.04142: Calling groups_inventory to load vars for managed-node3 13830 1727204097.04148: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204097.04154: Calling all_plugins_play to load vars for managed-node3 13830 1727204097.04156: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204097.04160: Calling groups_plugins_play to load vars for managed-node3 13830 1727204097.05595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204097.07084: done with get_vars() 13830 1727204097.07118: done getting variables 13830 1727204097.07174: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.099) 0:00:30.150 ***** 13830 1727204097.07207: entering _queue_task() for managed-node3/command 13830 1727204097.07549: worker is 1 (out of 1 available) 13830 1727204097.07561: exiting _queue_task() for managed-node3/command 13830 1727204097.07574: done queuing things up, now waiting for results queue to drain 13830 1727204097.07575: waiting for pending results... 
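[editor note] The xmit_hash_policy item a few entries back passes for the same substring reason as use_carrier: the kernel reports the active policy as a name plus its numeric id ("encap2+3 3" in the output above), so the expected value "encap2+3" is contained in the string. A small illustrative parser for that sysfs format, assuming the "<name> <index>" layout shown in the log:

    from pathlib import Path
    from typing import Tuple

    def read_xmit_hash_policy(device: str) -> Tuple[str, int]:
        """Parse /sys/class/net/<device>/bonding/xmit_hash_policy,
        e.g. "encap2+3 3" -> ("encap2+3", 3)."""
        text = Path(f"/sys/class/net/{device}/bonding/xmit_hash_policy").read_text()
        name, index = text.split()
        return name, int(index)

    if __name__ == "__main__":
        name, index = read_xmit_hash_policy("nm-bond")
        assert name == "encap2+3", f"unexpected policy {name!r}"
        print(name, index)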
13830 1727204097.07860: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 13830 1727204097.08011: in run() - task 0affcd87-79f5-1659-6b02-000000000631 13830 1727204097.08036: variable 'ansible_search_path' from source: unknown 13830 1727204097.08043: variable 'ansible_search_path' from source: unknown 13830 1727204097.08088: calling self._execute() 13830 1727204097.08198: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204097.08211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204097.08224: variable 'omit' from source: magic vars 13830 1727204097.08595: variable 'ansible_distribution_major_version' from source: facts 13830 1727204097.08614: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204097.08625: variable 'omit' from source: magic vars 13830 1727204097.08696: variable 'omit' from source: magic vars 13830 1727204097.08872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204097.10571: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204097.10618: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204097.10648: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204097.10675: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204097.10695: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204097.10765: variable 'interface' from source: include params 13830 1727204097.10769: variable 'controller_device' from source: play vars 13830 1727204097.10822: variable 'controller_device' from source: play vars 13830 1727204097.10843: variable 'omit' from source: magic vars 13830 1727204097.10868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204097.10891: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204097.10906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204097.10920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204097.10928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204097.10955: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204097.10958: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204097.10961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204097.11026: Set connection var ansible_connection to ssh 13830 1727204097.11034: Set connection var ansible_timeout to 10 13830 1727204097.11043: Set connection var ansible_shell_executable to /bin/sh 13830 1727204097.11050: Set connection var ansible_shell_type to sh 13830 1727204097.11055: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204097.11062: Set connection var ansible_pipelining to False 13830 1727204097.11089: variable 'ansible_shell_executable' from source: unknown 13830 1727204097.11093: variable 
'ansible_connection' from source: unknown 13830 1727204097.11095: variable 'ansible_module_compression' from source: unknown 13830 1727204097.11098: variable 'ansible_shell_type' from source: unknown 13830 1727204097.11100: variable 'ansible_shell_executable' from source: unknown 13830 1727204097.11102: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204097.11104: variable 'ansible_pipelining' from source: unknown 13830 1727204097.11108: variable 'ansible_timeout' from source: unknown 13830 1727204097.11112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204097.11196: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204097.11206: variable 'omit' from source: magic vars 13830 1727204097.11211: starting attempt loop 13830 1727204097.11214: running the handler 13830 1727204097.11226: _low_level_execute_command(): starting 13830 1727204097.11234: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204097.11907: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 13830 1727204097.11929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.11947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.12023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.13663: stdout chunk (state=3): >>>/root <<< 13830 1727204097.13782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.13897: stderr chunk (state=3): >>><<< 13830 1727204097.13914: stdout chunk (state=3): >>><<< 13830 1727204097.13954: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204097.13981: _low_level_execute_command(): starting 13830 1727204097.13993: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767 `" && echo ansible-tmp-1727204097.1396115-15839-231757024260767="` echo /root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767 `" ) && sleep 0' 13830 1727204097.15295: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.15299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.15327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204097.15330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204097.15333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.15335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.15397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204097.15409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.15478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.17450: stdout chunk (state=3): >>>ansible-tmp-1727204097.1396115-15839-231757024260767=/root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767 <<< 13830 1727204097.17557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.17645: stderr chunk (state=3): >>><<< 13830 1727204097.17649: stdout chunk (state=3): >>><<< 13830 1727204097.17909: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204097.1396115-15839-231757024260767=/root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204097.17913: variable 'ansible_module_compression' from source: unknown 13830 1727204097.17915: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204097.17917: variable 'ansible_facts' from source: unknown 13830 1727204097.17932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767/AnsiballZ_command.py 13830 1727204097.18128: Sending initial data 13830 1727204097.18131: Sent initial data (156 bytes) 13830 1727204097.19243: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.19262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.19280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.19308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.19352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.19367: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.19385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.19416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.19430: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.19443: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.19456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.19475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.19492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.19506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.19525: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.19541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.19621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204097.19653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.19673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.19761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.21666: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: 
Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204097.21711: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204097.21749: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpg8vyajkr /root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767/AnsiballZ_command.py <<< 13830 1727204097.22077: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204097.23535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.23663: stderr chunk (state=3): >>><<< 13830 1727204097.23668: stdout chunk (state=3): >>><<< 13830 1727204097.23671: done transferring module to remote 13830 1727204097.23672: _low_level_execute_command(): starting 13830 1727204097.23675: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767/ /root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767/AnsiballZ_command.py && sleep 0' 13830 1727204097.24541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.24557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.24576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.24601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.24649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.24661: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.24681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.24699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.24711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.24722: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.24737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.24754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.24773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.24790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.24802: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.24816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.24900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 
1727204097.24922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.24939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.25023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.26973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.26977: stdout chunk (state=3): >>><<< 13830 1727204097.26979: stderr chunk (state=3): >>><<< 13830 1727204097.27077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204097.27081: _low_level_execute_command(): starting 13830 1727204097.27083: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767/AnsiballZ_command.py && sleep 0' 13830 1727204097.29158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.29177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.29193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.29213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.29287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.29301: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.29372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.29392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.29406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.29418: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.29433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.29449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.29470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.29487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 <<< 13830 1727204097.29499: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.29514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.29689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204097.29720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.29741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.29837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.43610: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.149/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 230sec preferred_lft 230sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:57.431531", "end": "2024-09-24 14:54:57.435052", "delta": "0:00:00.003521", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204097.44779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204097.45817: stderr chunk (state=3): >>><<< 13830 1727204097.45822: stdout chunk (state=3): >>><<< 13830 1727204097.46013: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.149/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 230sec preferred_lft 230sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:57.431531", "end": "2024-09-24 14:54:57.435052", "delta": "0:00:00.003521", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
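The JSON payload above is the return value of ansible.legacy.command for ip -4 a s nm-bond: the module reports that the bond device nm-bond carries 192.0.2.149/24. The actual test task file is not reproduced in this log, so the following is only a minimal sketch of a task that would produce such a trace; the registered variable name is taken from the later "variable 'result' from source: set_fact" entry, the until expression from the "Evaluated conditional (address in result.stdout)" entry, and the retry values are assumptions (the result only reports "attempts": 1).

    # Hypothetical reconstruction -- not the file shipped with the collection.
    - name: "** TEST check IPv4"
      command: ip -4 a s nm-bond          # matches the logged _raw_params
      register: result
      until: address in result.stdout     # address is supplied as an include parameter
      retries: 20                         # assumed; only "attempts": 1 is logged
      delay: 2                            # assumed
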
13830 1727204097.46025: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204097.46028: _low_level_execute_command(): starting 13830 1727204097.46032: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204097.1396115-15839-231757024260767/ > /dev/null 2>&1 && sleep 0' 13830 1727204097.47416: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.47430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.47446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.47582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.47699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.47714: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.47729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.47747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.47759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.47774: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.47795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.47809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.47827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.47840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.47852: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.47871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.48051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204097.48078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.48097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.48189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.49995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.50061: stderr chunk (state=3): >>><<< 13830 1727204097.50067: stdout chunk (state=3): >>><<< 13830 1727204097.50278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204097.50282: handler run complete 13830 1727204097.50285: Evaluated conditional (False): False 13830 1727204097.50331: variable 'address' from source: include params 13830 1727204097.50343: variable 'result' from source: set_fact 13830 1727204097.50367: Evaluated conditional (address in result.stdout): True 13830 1727204097.50396: attempt loop complete, returning result 13830 1727204097.50404: _execute() done 13830 1727204097.50412: dumping result to json 13830 1727204097.50422: done dumping result, returning 13830 1727204097.50435: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 [0affcd87-79f5-1659-6b02-000000000631] 13830 1727204097.50444: sending task result for task 0affcd87-79f5-1659-6b02-000000000631 13830 1727204097.50586: done sending task result for task 0affcd87-79f5-1659-6b02-000000000631 ok: [managed-node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003521", "end": "2024-09-24 14:54:57.435052", "rc": 0, "start": "2024-09-24 14:54:57.431531" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.149/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 230sec preferred_lft 230sec 13830 1727204097.50680: no more pending results, returning what we have 13830 1727204097.50686: results queue empty 13830 1727204097.50687: checking for any_errors_fatal 13830 1727204097.50689: done checking for any_errors_fatal 13830 1727204097.50690: checking for max_fail_percentage 13830 1727204097.50692: done checking for max_fail_percentage 13830 1727204097.50693: checking to see if all hosts have failed and the running result is not ok 13830 1727204097.50693: done checking to see if all hosts have failed 13830 1727204097.50694: getting the remaining hosts for this loop 13830 1727204097.50696: done getting the remaining hosts for this loop 13830 1727204097.50701: getting the next task for host managed-node3 13830 1727204097.50711: done getting next task for host managed-node3 13830 1727204097.50714: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 13830 1727204097.50717: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204097.50722: getting variables 13830 1727204097.50724: in VariableManager get_vars() 13830 1727204097.50767: Calling all_inventory to load vars for managed-node3 13830 1727204097.50770: Calling groups_inventory to load vars for managed-node3 13830 1727204097.50774: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204097.50790: Calling all_plugins_play to load vars for managed-node3 13830 1727204097.50792: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204097.50795: Calling groups_plugins_play to load vars for managed-node3 13830 1727204097.51751: WORKER PROCESS EXITING 13830 1727204097.52786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204097.54971: done with get_vars() 13830 1727204097.54996: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.478) 0:00:30.629 ***** 13830 1727204097.55101: entering _queue_task() for managed-node3/include_tasks 13830 1727204097.55451: worker is 1 (out of 1 available) 13830 1727204097.55467: exiting _queue_task() for managed-node3/include_tasks 13830 1727204097.55479: done queuing things up, now waiting for results queue to drain 13830 1727204097.55481: waiting for pending results... 
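The include queued here sits at assert_bond_options.yml:16. The entries that follow show the ansible_distribution_major_version != '6' conditional being evaluated and the item variable (passed in as an include parameter) resolving to assert_IPv6_present.yml. A hedged sketch of what that line plausibly looks like; the actual file content is not part of this log, and the conditional may be inherited from an enclosing block rather than set on the task itself:

    # Hypothetical sketch of assert_bond_options.yml:16.
    - name: "Include the task '{{ item }}'"
      include_tasks: "{{ item }}"   # item resolves to assert_IPv6_present.yml in this run
      # The trace also evaluates ansible_distribution_major_version != '6' for
      # this task; whether that when: lives here or on a parent block is not
      # visible in the log.
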
13830 1727204097.55783: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_IPv6_present.yml' 13830 1727204097.55909: in run() - task 0affcd87-79f5-1659-6b02-000000000403 13830 1727204097.55941: variable 'ansible_search_path' from source: unknown 13830 1727204097.55950: variable 'ansible_search_path' from source: unknown 13830 1727204097.55996: calling self._execute() 13830 1727204097.56103: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204097.56115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204097.56133: variable 'omit' from source: magic vars 13830 1727204097.56536: variable 'ansible_distribution_major_version' from source: facts 13830 1727204097.56555: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204097.56567: _execute() done 13830 1727204097.56578: dumping result to json 13830 1727204097.56586: done dumping result, returning 13830 1727204097.56596: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_IPv6_present.yml' [0affcd87-79f5-1659-6b02-000000000403] 13830 1727204097.56606: sending task result for task 0affcd87-79f5-1659-6b02-000000000403 13830 1727204097.56748: no more pending results, returning what we have 13830 1727204097.56754: in VariableManager get_vars() 13830 1727204097.56799: Calling all_inventory to load vars for managed-node3 13830 1727204097.56802: Calling groups_inventory to load vars for managed-node3 13830 1727204097.56806: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204097.56822: Calling all_plugins_play to load vars for managed-node3 13830 1727204097.56825: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204097.56828: Calling groups_plugins_play to load vars for managed-node3 13830 1727204097.57882: done sending task result for task 0affcd87-79f5-1659-6b02-000000000403 13830 1727204097.57886: WORKER PROCESS EXITING 13830 1727204097.58528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204097.60181: done with get_vars() 13830 1727204097.60209: variable 'ansible_search_path' from source: unknown 13830 1727204097.60210: variable 'ansible_search_path' from source: unknown 13830 1727204097.60219: variable 'item' from source: include params 13830 1727204097.60332: variable 'item' from source: include params 13830 1727204097.60368: we have included files to process 13830 1727204097.60369: generating all_blocks data 13830 1727204097.60371: done generating all_blocks data 13830 1727204097.60376: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 13830 1727204097.60377: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 13830 1727204097.60379: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 13830 1727204097.60636: done processing included file 13830 1727204097.60639: iterating over new_blocks loaded from include file 13830 1727204097.60640: in VariableManager get_vars() 13830 1727204097.60659: done with get_vars() 13830 1727204097.60661: filtering new block on tags 13830 1727204097.60693: done filtering new block on tags 13830 1727204097.60696: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed-node3 13830 1727204097.60701: extending task lists for all hosts with included blocks 13830 1727204097.61081: done extending task lists 13830 1727204097.61082: done processing included files 13830 1727204097.61083: results queue empty 13830 1727204097.61084: checking for any_errors_fatal 13830 1727204097.61089: done checking for any_errors_fatal 13830 1727204097.61089: checking for max_fail_percentage 13830 1727204097.61090: done checking for max_fail_percentage 13830 1727204097.61091: checking to see if all hosts have failed and the running result is not ok 13830 1727204097.61092: done checking to see if all hosts have failed 13830 1727204097.61093: getting the remaining hosts for this loop 13830 1727204097.61094: done getting the remaining hosts for this loop 13830 1727204097.61097: getting the next task for host managed-node3 13830 1727204097.61102: done getting next task for host managed-node3 13830 1727204097.61104: ^ task is: TASK: ** TEST check IPv6 13830 1727204097.61107: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204097.61110: getting variables 13830 1727204097.61111: in VariableManager get_vars() 13830 1727204097.61121: Calling all_inventory to load vars for managed-node3 13830 1727204097.61123: Calling groups_inventory to load vars for managed-node3 13830 1727204097.61126: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204097.61134: Calling all_plugins_play to load vars for managed-node3 13830 1727204097.61137: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204097.61140: Calling groups_plugins_play to load vars for managed-node3 13830 1727204097.62814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204097.64480: done with get_vars() 13830 1727204097.64509: done getting variables 13830 1727204097.64563: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.094) 0:00:30.724 ***** 13830 1727204097.64599: entering _queue_task() for managed-node3/command 13830 1727204097.64958: worker is 1 (out of 1 available) 13830 1727204097.64972: exiting _queue_task() for managed-node3/command 13830 1727204097.64985: done queuing things up, now waiting for results queue to drain 13830 1727204097.64986: waiting for pending results... 
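This is the IPv6 counterpart of the IPv4 check above: assert_IPv6_present.yml:3 runs ip -6 a s nm-bond and, as the following entries show, the device name is read from the controller_device play variable and the registered result is tested with address in result.stdout. A minimal sketch under those assumptions; the real file is not reproduced here, and the retry values are guesses:

    # Hypothetical reconstruction of the task at assert_IPv6_present.yml:3.
    - name: "** TEST check IPv6"
      command: ip -6 a s {{ controller_device }}   # controller_device is nm-bond in this run
      register: result
      until: address in result.stdout
      retries: 20   # assumed
      delay: 2      # assumed
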
13830 1727204097.65289: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 13830 1727204097.65445: in run() - task 0affcd87-79f5-1659-6b02-000000000652 13830 1727204097.65468: variable 'ansible_search_path' from source: unknown 13830 1727204097.65477: variable 'ansible_search_path' from source: unknown 13830 1727204097.65517: calling self._execute() 13830 1727204097.65622: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204097.65636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204097.65654: variable 'omit' from source: magic vars 13830 1727204097.66022: variable 'ansible_distribution_major_version' from source: facts 13830 1727204097.66043: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204097.66052: variable 'omit' from source: magic vars 13830 1727204097.66113: variable 'omit' from source: magic vars 13830 1727204097.66273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204097.68744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204097.68824: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204097.68873: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204097.68913: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204097.68955: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204097.69052: variable 'controller_device' from source: play vars 13830 1727204097.69460: variable 'omit' from source: magic vars 13830 1727204097.69500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204097.69540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204097.69567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204097.69589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204097.69602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204097.69640: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204097.69653: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204097.69661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204097.69761: Set connection var ansible_connection to ssh 13830 1727204097.69782: Set connection var ansible_timeout to 10 13830 1727204097.69793: Set connection var ansible_shell_executable to /bin/sh 13830 1727204097.69799: Set connection var ansible_shell_type to sh 13830 1727204097.69808: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204097.69820: Set connection var ansible_pipelining to False 13830 1727204097.69851: variable 'ansible_shell_executable' from source: unknown 13830 1727204097.69862: variable 'ansible_connection' from source: unknown 13830 1727204097.69874: variable 'ansible_module_compression' from source: unknown 13830 1727204097.69882: variable 
'ansible_shell_type' from source: unknown 13830 1727204097.69888: variable 'ansible_shell_executable' from source: unknown 13830 1727204097.69895: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204097.69901: variable 'ansible_pipelining' from source: unknown 13830 1727204097.69906: variable 'ansible_timeout' from source: unknown 13830 1727204097.69912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204097.70023: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204097.70043: variable 'omit' from source: magic vars 13830 1727204097.70053: starting attempt loop 13830 1727204097.70058: running the handler 13830 1727204097.70080: _low_level_execute_command(): starting 13830 1727204097.70096: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204097.70825: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.70843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.70861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.70883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.70924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.70938: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.70954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.70977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.70989: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.71000: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.71010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.71022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.71042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.71054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.71067: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.71093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.71177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204097.71218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.71236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.71416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.72947: stdout chunk (state=3): >>>/root <<< 13830 1727204097.73137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.73141: stdout chunk (state=3): >>><<< 13830 1727204097.73148: stderr chunk 
(state=3): >>><<< 13830 1727204097.73592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204097.73603: _low_level_execute_command(): starting 13830 1727204097.73611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706 `" && echo ansible-tmp-1727204097.7359061-15878-262085582334706="` echo /root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706 `" ) && sleep 0' 13830 1727204097.76371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.76486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.76496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.76510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.76559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.76567: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.76579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.76651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.76658: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.76666: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.76675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.76684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.76696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.76703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.76710: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.76720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.76884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 13830 1727204097.76904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.76916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.76997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.78987: stdout chunk (state=3): >>>ansible-tmp-1727204097.7359061-15878-262085582334706=/root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706 <<< 13830 1727204097.79169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.79173: stdout chunk (state=3): >>><<< 13830 1727204097.79180: stderr chunk (state=3): >>><<< 13830 1727204097.79203: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204097.7359061-15878-262085582334706=/root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204097.79235: variable 'ansible_module_compression' from source: unknown 13830 1727204097.79289: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204097.79322: variable 'ansible_facts' from source: unknown 13830 1727204097.79399: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706/AnsiballZ_command.py 13830 1727204097.80010: Sending initial data 13830 1727204097.80017: Sent initial data (156 bytes) 13830 1727204097.83741: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.83750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.83761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.83778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.83819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.83828: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.83842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.83855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.83863: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.83868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.83876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.83885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.83897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.83904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.83910: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.83920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.83993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204097.84014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.84026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.84106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.85985: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204097.86016: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204097.86055: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpl281pr3l /root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706/AnsiballZ_command.py <<< 13830 1727204097.86094: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204097.87544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.87636: stderr chunk (state=3): >>><<< 13830 1727204097.87639: stdout chunk (state=3): >>><<< 13830 1727204097.87670: done transferring module to remote 13830 1727204097.87673: _low_level_execute_command(): starting 13830 1727204097.87676: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706/ /root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706/AnsiballZ_command.py && sleep 0' 13830 1727204097.89158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.89284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.89295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.89309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.89348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 
1727204097.89356: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.89368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.89383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.89394: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.89400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.89408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.89417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.89428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.89436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.89443: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.89452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.89638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204097.89658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.89673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.89766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204097.91716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204097.91720: stdout chunk (state=3): >>><<< 13830 1727204097.91725: stderr chunk (state=3): >>><<< 13830 1727204097.91742: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204097.91746: _low_level_execute_command(): starting 13830 1727204097.91749: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706/AnsiballZ_command.py && sleep 0' 13830 1727204097.94298: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204097.94616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.94625: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.94641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.94681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.94688: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204097.94698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.94713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204097.94729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204097.94734: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204097.94744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204097.94752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204097.94762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204097.94778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204097.94785: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204097.94794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204097.94876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204097.94891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204097.94903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204097.94995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204098.08404: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1aa/128 scope global dynamic noprefixroute \n valid_lft 230sec preferred_lft 230sec\n inet6 2001:db8::1624:aefc:eeaa:c36e/64 scope global dynamic noprefixroute \n valid_lft 1790sec preferred_lft 1790sec\n inet6 fe80::9d8b:f9cc:2acd:1842/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:58.079549", "end": "2024-09-24 14:54:58.082805", "delta": "0:00:00.003256", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204098.09628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204098.09632: stdout chunk (state=3): >>><<< 13830 1727204098.09635: stderr chunk (state=3): >>><<< 13830 1727204098.09784: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1aa/128 scope global dynamic noprefixroute \n valid_lft 230sec preferred_lft 230sec\n inet6 2001:db8::1624:aefc:eeaa:c36e/64 scope global dynamic noprefixroute \n valid_lft 1790sec preferred_lft 1790sec\n inet6 fe80::9d8b:f9cc:2acd:1842/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:58.079549", "end": "2024-09-24 14:54:58.082805", "delta": "0:00:00.003256", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
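Every module call in this trace pays the same remote round trip: create a temp directory under ~/.ansible/tmp, sftp-put AnsiballZ_command.py, chmod u+x it, run it with /usr/bin/python3.9, then rm -f -r the directory. That is the non-pipelined transport path; the connection setup earlier in this trace explicitly sets ansible_pipelining to False. With pipelining enabled, Ansible feeds the module to the remote interpreter over the already-open SSH channel and skips the temp-file transfer, which typically shortens runs like this one. A minimal group_vars sketch, assuming pipelining is acceptable for these hosts (privilege escalation must not require a tty, e.g. no requiretty in sudoers):

    # Hypothetical group_vars snippet -- not part of this test run.
    ansible_pipelining: true
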
13830 1727204098.09793: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204098.09796: _low_level_execute_command(): starting 13830 1727204098.09798: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204097.7359061-15878-262085582334706/ > /dev/null 2>&1 && sleep 0' 13830 1727204098.10452: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204098.10475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204098.10495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204098.10515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204098.10575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204098.10590: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204098.10615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204098.10642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204098.10665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204098.10678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204098.10691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204098.10705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204098.10726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204098.10742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204098.10759: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204098.10780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204098.10977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204098.11251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204098.11268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204098.13114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204098.13168: stderr chunk (state=3): >>><<< 13830 1727204098.13171: stdout chunk (state=3): >>><<< 13830 1727204098.13271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204098.13275: handler run complete 13830 1727204098.13283: Evaluated conditional (False): False 13830 1727204098.13573: variable 'address' from source: include params 13830 1727204098.13585: variable 'result' from source: set_fact 13830 1727204098.13620: Evaluated conditional (address in result.stdout): True 13830 1727204098.13638: attempt loop complete, returning result 13830 1727204098.13646: _execute() done 13830 1727204098.13652: dumping result to json 13830 1727204098.13661: done dumping result, returning 13830 1727204098.13677: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 [0affcd87-79f5-1659-6b02-000000000652] 13830 1727204098.13686: sending task result for task 0affcd87-79f5-1659-6b02-000000000652 ok: [managed-node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003256", "end": "2024-09-24 14:54:58.082805", "rc": 0, "start": "2024-09-24 14:54:58.079549" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::1aa/128 scope global dynamic noprefixroute valid_lft 230sec preferred_lft 230sec inet6 2001:db8::1624:aefc:eeaa:c36e/64 scope global dynamic noprefixroute valid_lft 1790sec preferred_lft 1790sec inet6 fe80::9d8b:f9cc:2acd:1842/64 scope link noprefixroute valid_lft forever preferred_lft forever 13830 1727204098.13915: no more pending results, returning what we have 13830 1727204098.13919: results queue empty 13830 1727204098.13920: checking for any_errors_fatal 13830 1727204098.13922: done checking for any_errors_fatal 13830 1727204098.13923: checking for max_fail_percentage 13830 1727204098.13925: done checking for max_fail_percentage 13830 1727204098.13926: checking to see if all hosts have failed and the running result is not ok 13830 1727204098.13926: done checking to see if all hosts have failed 13830 1727204098.13927: getting the remaining hosts for this loop 13830 1727204098.13931: done getting the remaining hosts for this loop 13830 1727204098.13935: getting the next task for host managed-node3 13830 1727204098.13946: done getting next task for host managed-node3 13830 1727204098.13949: ^ task is: TASK: Conditional asserts 13830 1727204098.13952: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204098.13958: getting variables 13830 1727204098.13959: in VariableManager get_vars() 13830 1727204098.13998: Calling all_inventory to load vars for managed-node3 13830 1727204098.14001: Calling groups_inventory to load vars for managed-node3 13830 1727204098.14005: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204098.14017: Calling all_plugins_play to load vars for managed-node3 13830 1727204098.14020: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204098.14023: Calling groups_plugins_play to load vars for managed-node3 13830 1727204098.14903: done sending task result for task 0affcd87-79f5-1659-6b02-000000000652 13830 1727204098.14907: WORKER PROCESS EXITING 13830 1727204098.18549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204098.21467: done with get_vars() 13830 1727204098.21512: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.570) 0:00:31.294 ***** 13830 1727204098.21626: entering _queue_task() for managed-node3/include_tasks 13830 1727204098.22087: worker is 1 (out of 1 available) 13830 1727204098.22101: exiting _queue_task() for managed-node3/include_tasks 13830 1727204098.22122: done queuing things up, now waiting for results queue to drain 13830 1727204098.22123: waiting for pending results... 
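For readers following the trace, the "** TEST check IPv6" entries above boil down to a retried command task whose success condition is "address in result.stdout". A minimal sketch of what that task plausibly looks like follows; the task name, the command, and the condition are taken from the log, while the register wiring and the retries/delay values are illustrative assumptions rather than the collection's actual source.

    # Approximate reconstruction of the "** TEST check IPv6" task traced above.
    # Only the command and the 'address in result.stdout' condition come from the
    # log; 'register', 'retries' and 'delay' are illustrative assumptions.
    - name: "** TEST check IPv6"
      command: ip -6 a s nm-bond
      register: result
      until: address in result.stdout
      retries: 20
      delay: 3
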
13830 1727204098.22480: running TaskExecutor() for managed-node3/TASK: Conditional asserts 13830 1727204098.22722: in run() - task 0affcd87-79f5-1659-6b02-00000000008e 13830 1727204098.22766: variable 'ansible_search_path' from source: unknown 13830 1727204098.22777: variable 'ansible_search_path' from source: unknown 13830 1727204098.23324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204098.25870: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204098.26477: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204098.26535: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204098.26579: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204098.26617: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204098.26723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204098.26760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204098.26798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204098.26857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204098.26887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204098.27074: dumping result to json 13830 1727204098.27084: done dumping result, returning 13830 1727204098.27096: done running TaskExecutor() for managed-node3/TASK: Conditional asserts [0affcd87-79f5-1659-6b02-00000000008e] 13830 1727204098.27106: sending task result for task 0affcd87-79f5-1659-6b02-00000000008e skipping: [managed-node3] => { "changed": false, "skipped_reason": "No items in the list" } 13830 1727204098.27290: no more pending results, returning what we have 13830 1727204098.27294: results queue empty 13830 1727204098.27295: checking for any_errors_fatal 13830 1727204098.27304: done checking for any_errors_fatal 13830 1727204098.27305: checking for max_fail_percentage 13830 1727204098.27307: done checking for max_fail_percentage 13830 1727204098.27308: checking to see if all hosts have failed and the running result is not ok 13830 1727204098.27309: done checking to see if all hosts have failed 13830 1727204098.27310: getting the remaining hosts for this loop 13830 1727204098.27312: done getting the remaining hosts for this loop 13830 1727204098.27317: getting the next task for host managed-node3 13830 1727204098.27325: done getting next task for host managed-node3 13830 1727204098.27328: ^ task is: TASK: Success in test '{{ lsr_description }}' 13830 
1727204098.27335: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204098.27340: getting variables 13830 1727204098.27342: in VariableManager get_vars() 13830 1727204098.27381: Calling all_inventory to load vars for managed-node3 13830 1727204098.27384: Calling groups_inventory to load vars for managed-node3 13830 1727204098.27388: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204098.27399: Calling all_plugins_play to load vars for managed-node3 13830 1727204098.27402: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204098.27405: Calling groups_plugins_play to load vars for managed-node3 13830 1727204098.29000: done sending task result for task 0affcd87-79f5-1659-6b02-00000000008e 13830 1727204098.29005: WORKER PROCESS EXITING 13830 1727204098.29978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204098.31599: done with get_vars() 13830 1727204098.31634: done getting variables 13830 1727204098.31696: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204098.31821: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.102) 0:00:31.396 ***** 13830 1727204098.31857: entering _queue_task() for managed-node3/debug 13830 1727204098.32201: worker is 1 (out of 1 available) 13830 1727204098.32214: exiting _queue_task() for managed-node3/debug 13830 1727204098.32226: done queuing things up, now waiting for results queue to drain 13830 1727204098.32227: waiting for pending results... 13830 1727204098.32523: running TaskExecutor() for managed-node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
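The task being dispatched here is the success banner from run_test.yml:47. Judging from the task name and the MSG emitted a few entries further on, it is essentially a debug call along the following lines (a sketch inferred from the log, not the collection's verbatim source):

    # Sketch of the "Success in test ..." task (run_test.yml:47), inferred from the
    # task name and the "+++++ Success in test ... +++++" message in the trace.
    - name: "Success in test '{{ lsr_description }}'"
      debug:
        msg: "+++++ Success in test '{{ lsr_description }}' +++++"
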
13830 1727204098.32647: in run() - task 0affcd87-79f5-1659-6b02-00000000008f 13830 1727204098.32676: variable 'ansible_search_path' from source: unknown 13830 1727204098.32685: variable 'ansible_search_path' from source: unknown 13830 1727204098.32725: calling self._execute() 13830 1727204098.32829: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204098.32844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204098.32856: variable 'omit' from source: magic vars 13830 1727204098.33240: variable 'ansible_distribution_major_version' from source: facts 13830 1727204098.33259: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204098.33272: variable 'omit' from source: magic vars 13830 1727204098.33314: variable 'omit' from source: magic vars 13830 1727204098.33421: variable 'lsr_description' from source: include params 13830 1727204098.33452: variable 'omit' from source: magic vars 13830 1727204098.33501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204098.33549: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204098.33578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204098.33600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204098.33617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204098.33659: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204098.33669: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204098.33677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204098.33787: Set connection var ansible_connection to ssh 13830 1727204098.33803: Set connection var ansible_timeout to 10 13830 1727204098.33814: Set connection var ansible_shell_executable to /bin/sh 13830 1727204098.33821: Set connection var ansible_shell_type to sh 13830 1727204098.33833: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204098.33848: Set connection var ansible_pipelining to False 13830 1727204098.33882: variable 'ansible_shell_executable' from source: unknown 13830 1727204098.33890: variable 'ansible_connection' from source: unknown 13830 1727204098.33897: variable 'ansible_module_compression' from source: unknown 13830 1727204098.33903: variable 'ansible_shell_type' from source: unknown 13830 1727204098.33910: variable 'ansible_shell_executable' from source: unknown 13830 1727204098.33916: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204098.33923: variable 'ansible_pipelining' from source: unknown 13830 1727204098.33929: variable 'ansible_timeout' from source: unknown 13830 1727204098.33941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204098.34092: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204098.34111: variable 'omit' from source: magic vars 13830 1727204098.34120: 
starting attempt loop 13830 1727204098.34126: running the handler 13830 1727204098.34180: handler run complete 13830 1727204098.34203: attempt loop complete, returning result 13830 1727204098.34210: _execute() done 13830 1727204098.34215: dumping result to json 13830 1727204098.34221: done dumping result, returning 13830 1727204098.34235: done running TaskExecutor() for managed-node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [0affcd87-79f5-1659-6b02-00000000008f] 13830 1727204098.34246: sending task result for task 0affcd87-79f5-1659-6b02-00000000008f ok: [managed-node3] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++ 13830 1727204098.34394: no more pending results, returning what we have 13830 1727204098.34398: results queue empty 13830 1727204098.34399: checking for any_errors_fatal 13830 1727204098.34407: done checking for any_errors_fatal 13830 1727204098.34408: checking for max_fail_percentage 13830 1727204098.34411: done checking for max_fail_percentage 13830 1727204098.34412: checking to see if all hosts have failed and the running result is not ok 13830 1727204098.34412: done checking to see if all hosts have failed 13830 1727204098.34413: getting the remaining hosts for this loop 13830 1727204098.34415: done getting the remaining hosts for this loop 13830 1727204098.34420: getting the next task for host managed-node3 13830 1727204098.34429: done getting next task for host managed-node3 13830 1727204098.34436: ^ task is: TASK: Cleanup 13830 1727204098.34439: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204098.34444: getting variables 13830 1727204098.34446: in VariableManager get_vars() 13830 1727204098.34484: Calling all_inventory to load vars for managed-node3 13830 1727204098.34488: Calling groups_inventory to load vars for managed-node3 13830 1727204098.34492: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204098.34503: Calling all_plugins_play to load vars for managed-node3 13830 1727204098.34506: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204098.34509: Calling groups_plugins_play to load vars for managed-node3 13830 1727204098.35483: done sending task result for task 0affcd87-79f5-1659-6b02-00000000008f 13830 1727204098.35486: WORKER PROCESS EXITING 13830 1727204098.36666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204098.40368: done with get_vars() 13830 1727204098.40402: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.087) 0:00:31.484 ***** 13830 1727204098.40619: entering _queue_task() for managed-node3/include_tasks 13830 1727204098.41277: worker is 1 (out of 1 available) 13830 1727204098.41289: exiting _queue_task() for managed-node3/include_tasks 13830 1727204098.41301: done queuing things up, now waiting for results queue to drain 13830 1727204098.41303: waiting for pending results... 13830 1727204098.42252: running TaskExecutor() for managed-node3/TASK: Cleanup 13830 1727204098.42482: in run() - task 0affcd87-79f5-1659-6b02-000000000093 13830 1727204098.42537: variable 'ansible_search_path' from source: unknown 13830 1727204098.42541: variable 'ansible_search_path' from source: unknown 13830 1727204098.42593: variable 'lsr_cleanup' from source: include params 13830 1727204098.43246: variable 'lsr_cleanup' from source: include params 13830 1727204098.43413: variable 'omit' from source: magic vars 13830 1727204098.43802: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204098.43872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204098.43887: variable 'omit' from source: magic vars 13830 1727204098.44278: variable 'ansible_distribution_major_version' from source: facts 13830 1727204098.44294: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204098.44306: variable 'item' from source: unknown 13830 1727204098.44378: variable 'item' from source: unknown 13830 1727204098.44422: variable 'item' from source: unknown 13830 1727204098.44490: variable 'item' from source: unknown 13830 1727204098.44696: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204098.44708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204098.44723: variable 'omit' from source: magic vars 13830 1727204098.44883: variable 'ansible_distribution_major_version' from source: facts 13830 1727204098.44893: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204098.44902: variable 'item' from source: unknown 13830 1727204098.44972: variable 'item' from source: unknown 13830 1727204098.45006: variable 'item' from source: unknown 13830 1727204098.45074: variable 'item' from source: unknown 13830 1727204098.45160: dumping result to json 13830 
1727204098.45174: done dumping result, returning 13830 1727204098.45188: done running TaskExecutor() for managed-node3/TASK: Cleanup [0affcd87-79f5-1659-6b02-000000000093] 13830 1727204098.45201: sending task result for task 0affcd87-79f5-1659-6b02-000000000093 13830 1727204098.45280: done sending task result for task 0affcd87-79f5-1659-6b02-000000000093 13830 1727204098.45313: no more pending results, returning what we have 13830 1727204098.45319: in VariableManager get_vars() 13830 1727204098.45365: Calling all_inventory to load vars for managed-node3 13830 1727204098.45369: Calling groups_inventory to load vars for managed-node3 13830 1727204098.45373: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204098.45387: Calling all_plugins_play to load vars for managed-node3 13830 1727204098.45391: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204098.45394: Calling groups_plugins_play to load vars for managed-node3 13830 1727204098.46484: WORKER PROCESS EXITING 13830 1727204098.60278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204098.63298: done with get_vars() 13830 1727204098.63329: variable 'ansible_search_path' from source: unknown 13830 1727204098.63333: variable 'ansible_search_path' from source: unknown 13830 1727204098.64618: variable 'ansible_search_path' from source: unknown 13830 1727204098.64620: variable 'ansible_search_path' from source: unknown 13830 1727204098.64657: we have included files to process 13830 1727204098.64658: generating all_blocks data 13830 1727204098.64660: done generating all_blocks data 13830 1727204098.64667: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 13830 1727204098.64668: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 13830 1727204098.64671: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 13830 1727204098.65115: in VariableManager get_vars() 13830 1727204098.65141: done with get_vars() 13830 1727204098.65147: variable 'omit' from source: magic vars 13830 1727204098.65190: variable 'omit' from source: magic vars 13830 1727204098.65252: in VariableManager get_vars() 13830 1727204098.65268: done with get_vars() 13830 1727204098.65294: in VariableManager get_vars() 13830 1727204098.65310: done with get_vars() 13830 1727204098.65349: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13830 1727204098.66281: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13830 1727204098.66373: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13830 1727204098.66804: in VariableManager get_vars() 13830 1727204098.66835: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13830 1727204098.69737: done processing included file 13830 1727204098.69739: iterating over new_blocks loaded from include file 13830 1727204098.69740: in VariableManager get_vars() 13830 1727204098.69933: done with get_vars() 13830 1727204098.69935: filtering new block on tags 13830 1727204098.70395: done filtering new block on tags 13830 
1727204098.70400: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed-node3 => (item=tasks/cleanup_bond_profile+device.yml) 13830 1727204098.70405: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 13830 1727204098.70406: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 13830 1727204098.70411: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 13830 1727204098.70968: done processing included file 13830 1727204098.70970: iterating over new_blocks loaded from include file 13830 1727204098.70972: in VariableManager get_vars() 13830 1727204098.70991: done with get_vars() 13830 1727204098.70993: filtering new block on tags 13830 1727204098.71024: done filtering new block on tags 13830 1727204098.71027: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed-node3 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 13830 1727204098.71034: extending task lists for all hosts with included blocks 13830 1727204098.73859: done extending task lists 13830 1727204098.73861: done processing included files 13830 1727204098.73862: results queue empty 13830 1727204098.73865: checking for any_errors_fatal 13830 1727204098.73869: done checking for any_errors_fatal 13830 1727204098.73870: checking for max_fail_percentage 13830 1727204098.73871: done checking for max_fail_percentage 13830 1727204098.73873: checking to see if all hosts have failed and the running result is not ok 13830 1727204098.73874: done checking to see if all hosts have failed 13830 1727204098.73874: getting the remaining hosts for this loop 13830 1727204098.73876: done getting the remaining hosts for this loop 13830 1727204098.73879: getting the next task for host managed-node3 13830 1727204098.73885: done getting next task for host managed-node3 13830 1727204098.73893: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13830 1727204098.73896: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13830 1727204098.73907: getting variables 13830 1727204098.73909: in VariableManager get_vars() 13830 1727204098.73928: Calling all_inventory to load vars for managed-node3 13830 1727204098.73933: Calling groups_inventory to load vars for managed-node3 13830 1727204098.73936: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204098.73941: Calling all_plugins_play to load vars for managed-node3 13830 1727204098.73943: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204098.73946: Calling groups_plugins_play to load vars for managed-node3 13830 1727204098.75751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204098.79008: done with get_vars() 13830 1727204098.79039: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.385) 0:00:31.869 ***** 13830 1727204098.79130: entering _queue_task() for managed-node3/include_tasks 13830 1727204098.79493: worker is 1 (out of 1 available) 13830 1727204098.79505: exiting _queue_task() for managed-node3/include_tasks 13830 1727204098.79518: done queuing things up, now waiting for results queue to drain 13830 1727204098.79520: waiting for pending results... 13830 1727204098.79846: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13830 1727204098.80007: in run() - task 0affcd87-79f5-1659-6b02-000000000693 13830 1727204098.80015: variable 'ansible_search_path' from source: unknown 13830 1727204098.80019: variable 'ansible_search_path' from source: unknown 13830 1727204098.80056: calling self._execute() 13830 1727204098.80154: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204098.80159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204098.80174: variable 'omit' from source: magic vars 13830 1727204098.80571: variable 'ansible_distribution_major_version' from source: facts 13830 1727204098.80586: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204098.80589: _execute() done 13830 1727204098.80592: dumping result to json 13830 1727204098.80594: done dumping result, returning 13830 1727204098.80602: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1659-6b02-000000000693] 13830 1727204098.80608: sending task result for task 0affcd87-79f5-1659-6b02-000000000693 13830 1727204098.80712: done sending task result for task 0affcd87-79f5-1659-6b02-000000000693 13830 1727204098.80714: WORKER PROCESS EXITING 13830 1727204098.80770: no more pending results, returning what we have 13830 1727204098.80776: in VariableManager get_vars() 13830 1727204098.80828: Calling all_inventory to load vars for managed-node3 13830 1727204098.80832: Calling groups_inventory to load vars for managed-node3 13830 1727204098.80834: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204098.80848: Calling all_plugins_play to load vars for managed-node3 13830 1727204098.80851: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204098.80854: Calling groups_plugins_play to load vars for managed-node3 13830 
1727204098.82499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204098.85213: done with get_vars() 13830 1727204098.85291: variable 'ansible_search_path' from source: unknown 13830 1727204098.85293: variable 'ansible_search_path' from source: unknown 13830 1727204098.85457: we have included files to process 13830 1727204098.85459: generating all_blocks data 13830 1727204098.85461: done generating all_blocks data 13830 1727204098.85465: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204098.85467: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204098.85470: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204098.86638: done processing included file 13830 1727204098.86640: iterating over new_blocks loaded from include file 13830 1727204098.86642: in VariableManager get_vars() 13830 1727204098.86739: done with get_vars() 13830 1727204098.86741: filtering new block on tags 13830 1727204098.86779: done filtering new block on tags 13830 1727204098.86782: in VariableManager get_vars() 13830 1727204098.86808: done with get_vars() 13830 1727204098.86810: filtering new block on tags 13830 1727204098.86981: done filtering new block on tags 13830 1727204098.86984: in VariableManager get_vars() 13830 1727204098.87009: done with get_vars() 13830 1727204098.87011: filtering new block on tags 13830 1727204098.87174: done filtering new block on tags 13830 1727204098.87177: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 13830 1727204098.87182: extending task lists for all hosts with included blocks 13830 1727204098.89216: done extending task lists 13830 1727204098.89217: done processing included files 13830 1727204098.89218: results queue empty 13830 1727204098.89219: checking for any_errors_fatal 13830 1727204098.89224: done checking for any_errors_fatal 13830 1727204098.89225: checking for max_fail_percentage 13830 1727204098.89231: done checking for max_fail_percentage 13830 1727204098.89232: checking to see if all hosts have failed and the running result is not ok 13830 1727204098.89233: done checking to see if all hosts have failed 13830 1727204098.89233: getting the remaining hosts for this loop 13830 1727204098.89235: done getting the remaining hosts for this loop 13830 1727204098.89238: getting the next task for host managed-node3 13830 1727204098.89243: done getting next task for host managed-node3 13830 1727204098.89246: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13830 1727204098.89251: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204098.89262: getting variables 13830 1727204098.89263: in VariableManager get_vars() 13830 1727204098.89284: Calling all_inventory to load vars for managed-node3 13830 1727204098.89286: Calling groups_inventory to load vars for managed-node3 13830 1727204098.89289: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204098.89295: Calling all_plugins_play to load vars for managed-node3 13830 1727204098.89297: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204098.89300: Calling groups_plugins_play to load vars for managed-node3 13830 1727204098.90590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204098.92305: done with get_vars() 13830 1727204098.92334: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.132) 0:00:32.002 ***** 13830 1727204098.92431: entering _queue_task() for managed-node3/setup 13830 1727204098.92790: worker is 1 (out of 1 available) 13830 1727204098.92810: exiting _queue_task() for managed-node3/setup 13830 1727204098.92823: done queuing things up, now waiting for results queue to drain 13830 1727204098.92825: waiting for pending results... 
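The setup task queued at this point (roles/network/tasks/set_facts.yml:3) is skipped a little further down because "__network_required_facts | difference(ansible_facts.keys() | list) | length > 0" evaluates to False, i.e. every fact the role needs is already cached. A rough, hypothetical rendering of such a guarded fact-gathering task is shown below; the gather_subset value is an assumption, and only the condition and the no_log behaviour are evidenced in the trace.

    # Hypothetical sketch of a guarded fact-gathering task matching the trace;
    # 'gather_subset: min' is an assumed value, not taken from the role source.
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true
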
13830 1727204098.93136: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13830 1727204098.93312: in run() - task 0affcd87-79f5-1659-6b02-0000000007c9 13830 1727204098.93318: variable 'ansible_search_path' from source: unknown 13830 1727204098.93322: variable 'ansible_search_path' from source: unknown 13830 1727204098.93366: calling self._execute() 13830 1727204098.93457: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204098.93461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204098.93472: variable 'omit' from source: magic vars 13830 1727204098.93869: variable 'ansible_distribution_major_version' from source: facts 13830 1727204098.93887: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204098.94114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204098.96561: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204098.96634: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204098.96676: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204098.96713: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204098.96740: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204098.96827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204098.96855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204098.96891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204098.96936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204098.96952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204098.97008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204098.97036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204098.97061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204098.97107: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204098.97122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204098.97324: variable '__network_required_facts' from source: role '' defaults 13830 1727204098.97335: variable 'ansible_facts' from source: unknown 13830 1727204098.98276: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13830 1727204098.98280: when evaluation is False, skipping this task 13830 1727204098.98284: _execute() done 13830 1727204098.98286: dumping result to json 13830 1727204098.98288: done dumping result, returning 13830 1727204098.98296: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1659-6b02-0000000007c9] 13830 1727204098.98301: sending task result for task 0affcd87-79f5-1659-6b02-0000000007c9 13830 1727204098.98407: done sending task result for task 0affcd87-79f5-1659-6b02-0000000007c9 13830 1727204098.98411: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204098.98465: no more pending results, returning what we have 13830 1727204098.98469: results queue empty 13830 1727204098.98470: checking for any_errors_fatal 13830 1727204098.98472: done checking for any_errors_fatal 13830 1727204098.98473: checking for max_fail_percentage 13830 1727204098.98475: done checking for max_fail_percentage 13830 1727204098.98476: checking to see if all hosts have failed and the running result is not ok 13830 1727204098.98476: done checking to see if all hosts have failed 13830 1727204098.98477: getting the remaining hosts for this loop 13830 1727204098.98479: done getting the remaining hosts for this loop 13830 1727204098.98483: getting the next task for host managed-node3 13830 1727204098.98496: done getting next task for host managed-node3 13830 1727204098.98500: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13830 1727204098.98507: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204098.98525: getting variables 13830 1727204098.98527: in VariableManager get_vars() 13830 1727204098.98574: Calling all_inventory to load vars for managed-node3 13830 1727204098.98577: Calling groups_inventory to load vars for managed-node3 13830 1727204098.98580: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204098.98591: Calling all_plugins_play to load vars for managed-node3 13830 1727204098.98594: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204098.98604: Calling groups_plugins_play to load vars for managed-node3 13830 1727204099.00455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204099.02269: done with get_vars() 13830 1727204099.02300: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.099) 0:00:32.102 ***** 13830 1727204099.02415: entering _queue_task() for managed-node3/stat 13830 1727204099.02938: worker is 1 (out of 1 available) 13830 1727204099.02953: exiting _queue_task() for managed-node3/stat 13830 1727204099.02968: done queuing things up, now waiting for results queue to drain 13830 1727204099.02970: waiting for pending results... 13830 1727204099.03267: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13830 1727204099.03426: in run() - task 0affcd87-79f5-1659-6b02-0000000007cb 13830 1727204099.03438: variable 'ansible_search_path' from source: unknown 13830 1727204099.03442: variable 'ansible_search_path' from source: unknown 13830 1727204099.03479: calling self._execute() 13830 1727204099.03572: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204099.03576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204099.03586: variable 'omit' from source: magic vars 13830 1727204099.03971: variable 'ansible_distribution_major_version' from source: facts 13830 1727204099.03984: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204099.04139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204099.04628: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204099.04663: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204099.04700: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204099.04736: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204099.04828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204099.04854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204099.04884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204099.04911: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204099.05009: variable '__network_is_ostree' from source: set_fact 13830 1727204099.05012: Evaluated conditional (not __network_is_ostree is defined): False 13830 1727204099.05018: when evaluation is False, skipping this task 13830 1727204099.05021: _execute() done 13830 1727204099.05023: dumping result to json 13830 1727204099.05026: done dumping result, returning 13830 1727204099.05035: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1659-6b02-0000000007cb] 13830 1727204099.05048: sending task result for task 0affcd87-79f5-1659-6b02-0000000007cb 13830 1727204099.05150: done sending task result for task 0affcd87-79f5-1659-6b02-0000000007cb 13830 1727204099.05153: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13830 1727204099.05212: no more pending results, returning what we have 13830 1727204099.05218: results queue empty 13830 1727204099.05219: checking for any_errors_fatal 13830 1727204099.05228: done checking for any_errors_fatal 13830 1727204099.05229: checking for max_fail_percentage 13830 1727204099.05231: done checking for max_fail_percentage 13830 1727204099.05232: checking to see if all hosts have failed and the running result is not ok 13830 1727204099.05233: done checking to see if all hosts have failed 13830 1727204099.05234: getting the remaining hosts for this loop 13830 1727204099.05236: done getting the remaining hosts for this loop 13830 1727204099.05240: getting the next task for host managed-node3 13830 1727204099.05249: done getting next task for host managed-node3 13830 1727204099.05255: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13830 1727204099.05262: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204099.05282: getting variables 13830 1727204099.05285: in VariableManager get_vars() 13830 1727204099.05327: Calling all_inventory to load vars for managed-node3 13830 1727204099.05330: Calling groups_inventory to load vars for managed-node3 13830 1727204099.05332: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204099.05343: Calling all_plugins_play to load vars for managed-node3 13830 1727204099.05346: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204099.05348: Calling groups_plugins_play to load vars for managed-node3 13830 1727204099.07017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204099.08945: done with get_vars() 13830 1727204099.08980: done getting variables 13830 1727204099.09049: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.066) 0:00:32.169 ***** 13830 1727204099.09092: entering _queue_task() for managed-node3/set_fact 13830 1727204099.09599: worker is 1 (out of 1 available) 13830 1727204099.09612: exiting _queue_task() for managed-node3/set_fact 13830 1727204099.09624: done queuing things up, now waiting for results queue to drain 13830 1727204099.09626: waiting for pending results... 
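Both ostree-related tasks in set_facts.yml (the stat check at line 12 and the set_fact at line 17) are skipped in this run because "__network_is_ostree" is already defined from an earlier pass. As a hedged illustration of what such a detection pair typically looks like, not a quote of the role's source:

    # Illustrative ostree detection pair; the task names and the
    # 'not __network_is_ostree is defined' guard come from the log, while the
    # stat path and the helper variable name are assumptions.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted
      register: __ostree_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_stat.stat.exists | d(false) }}"
      when: not __network_is_ostree is defined
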
13830 1727204099.09979: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13830 1727204099.10151: in run() - task 0affcd87-79f5-1659-6b02-0000000007cc 13830 1727204099.10167: variable 'ansible_search_path' from source: unknown 13830 1727204099.10172: variable 'ansible_search_path' from source: unknown 13830 1727204099.10262: calling self._execute() 13830 1727204099.10381: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204099.10385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204099.10401: variable 'omit' from source: magic vars 13830 1727204099.10857: variable 'ansible_distribution_major_version' from source: facts 13830 1727204099.10872: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204099.11113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204099.11506: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204099.11553: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204099.11588: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204099.11627: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204099.11716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204099.11742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204099.11890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204099.11912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204099.12078: variable '__network_is_ostree' from source: set_fact 13830 1727204099.12095: Evaluated conditional (not __network_is_ostree is defined): False 13830 1727204099.12098: when evaluation is False, skipping this task 13830 1727204099.12101: _execute() done 13830 1727204099.12104: dumping result to json 13830 1727204099.12106: done dumping result, returning 13830 1727204099.12109: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1659-6b02-0000000007cc] 13830 1727204099.12111: sending task result for task 0affcd87-79f5-1659-6b02-0000000007cc 13830 1727204099.12217: done sending task result for task 0affcd87-79f5-1659-6b02-0000000007cc 13830 1727204099.12220: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13830 1727204099.12279: no more pending results, returning what we have 13830 1727204099.12284: results queue empty 13830 1727204099.12287: checking for any_errors_fatal 13830 1727204099.12295: done checking for any_errors_fatal 13830 
1727204099.12296: checking for max_fail_percentage 13830 1727204099.12297: done checking for max_fail_percentage 13830 1727204099.12298: checking to see if all hosts have failed and the running result is not ok 13830 1727204099.12299: done checking to see if all hosts have failed 13830 1727204099.12300: getting the remaining hosts for this loop 13830 1727204099.12302: done getting the remaining hosts for this loop 13830 1727204099.12307: getting the next task for host managed-node3 13830 1727204099.12318: done getting next task for host managed-node3 13830 1727204099.12322: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13830 1727204099.12330: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204099.12348: getting variables 13830 1727204099.12350: in VariableManager get_vars() 13830 1727204099.12393: Calling all_inventory to load vars for managed-node3 13830 1727204099.12396: Calling groups_inventory to load vars for managed-node3 13830 1727204099.12399: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204099.12410: Calling all_plugins_play to load vars for managed-node3 13830 1727204099.12413: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204099.12417: Calling groups_plugins_play to load vars for managed-node3 13830 1727204099.14399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204099.18267: done with get_vars() 13830 1727204099.18502: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.095) 0:00:32.264 ***** 13830 1727204099.18612: entering _queue_task() for managed-node3/service_facts 13830 1727204099.18952: worker is 1 (out of 1 available) 13830 1727204099.19466: exiting _queue_task() for managed-node3/service_facts 13830 1727204099.19479: done queuing things up, now waiting for results queue to drain 13830 1727204099.19481: waiting for pending results... 
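The next task queued, "Check which services are running" (set_facts.yml:21), uses the service_facts module, which the entries below execute over SSH to populate ansible_facts.services on the managed node. A minimal sketch, assuming no options beyond what the trace shows:

    # Minimal sketch of the service scan traced below; service_facts takes no
    # arguments and stores its result in ansible_facts.services.
    - name: Check which services are running
      service_facts:
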
13830 1727204099.19919: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 13830 1727204099.20218: in run() - task 0affcd87-79f5-1659-6b02-0000000007ce 13830 1727204099.20358: variable 'ansible_search_path' from source: unknown 13830 1727204099.20373: variable 'ansible_search_path' from source: unknown 13830 1727204099.20417: calling self._execute() 13830 1727204099.20535: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204099.20674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204099.20689: variable 'omit' from source: magic vars 13830 1727204099.21387: variable 'ansible_distribution_major_version' from source: facts 13830 1727204099.21484: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204099.21544: variable 'omit' from source: magic vars 13830 1727204099.21744: variable 'omit' from source: magic vars 13830 1727204099.21788: variable 'omit' from source: magic vars 13830 1727204099.21905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204099.22000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204099.22098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204099.22121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204099.22187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204099.22224: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204099.22289: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204099.22299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204099.22459: Set connection var ansible_connection to ssh 13830 1727204099.22540: Set connection var ansible_timeout to 10 13830 1727204099.23280: Set connection var ansible_shell_executable to /bin/sh 13830 1727204099.23296: Set connection var ansible_shell_type to sh 13830 1727204099.23479: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204099.23559: Set connection var ansible_pipelining to False 13830 1727204099.23591: variable 'ansible_shell_executable' from source: unknown 13830 1727204099.23599: variable 'ansible_connection' from source: unknown 13830 1727204099.23606: variable 'ansible_module_compression' from source: unknown 13830 1727204099.23614: variable 'ansible_shell_type' from source: unknown 13830 1727204099.23620: variable 'ansible_shell_executable' from source: unknown 13830 1727204099.23724: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204099.23737: variable 'ansible_pipelining' from source: unknown 13830 1727204099.23754: variable 'ansible_timeout' from source: unknown 13830 1727204099.23763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204099.23962: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204099.24289: variable 'omit' from source: magic vars 13830 
1727204099.24299: starting attempt loop 13830 1727204099.24305: running the handler 13830 1727204099.24324: _low_level_execute_command(): starting 13830 1727204099.24470: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204099.27347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204099.27362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204099.27367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.27493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204099.27498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.27580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204099.27584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204099.27696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204099.27908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204099.29534: stdout chunk (state=3): >>>/root <<< 13830 1727204099.29635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204099.29724: stderr chunk (state=3): >>><<< 13830 1727204099.29727: stdout chunk (state=3): >>><<< 13830 1727204099.29838: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204099.29842: _low_level_execute_command(): starting 13830 1727204099.29845: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257 `" && echo ansible-tmp-1727204099.2974768-16026-172538011757257="` echo /root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257 `" ) && sleep 0' 13830 1727204099.31280: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204099.31284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204099.31319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.31330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204099.31335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.31513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204099.31577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204099.31775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204099.33729: stdout chunk (state=3): >>>ansible-tmp-1727204099.2974768-16026-172538011757257=/root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257 <<< 13830 1727204099.33839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204099.33924: stderr chunk (state=3): >>><<< 13830 1727204099.33927: stdout chunk (state=3): >>><<< 13830 1727204099.34073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204099.2974768-16026-172538011757257=/root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204099.34076: variable 'ansible_module_compression' from 
source: unknown 13830 1727204099.34079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13830 1727204099.34081: variable 'ansible_facts' from source: unknown 13830 1727204099.34170: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257/AnsiballZ_service_facts.py 13830 1727204099.35295: Sending initial data 13830 1727204099.35298: Sent initial data (162 bytes) 13830 1727204099.37445: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204099.37597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204099.37615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204099.37634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204099.37686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204099.37702: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204099.37717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.37735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204099.37747: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204099.37757: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204099.37771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204099.37784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204099.37803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204099.37818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204099.37829: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204099.37842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.38013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204099.38041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204099.38058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204099.38141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204099.40031: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204099.40042: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204099.40088: stdout chunk (state=3): >>>sftp> 
put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp7nflr6ba /root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257/AnsiballZ_service_facts.py <<< 13830 1727204099.40131: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204099.41457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204099.41544: stderr chunk (state=3): >>><<< 13830 1727204099.41548: stdout chunk (state=3): >>><<< 13830 1727204099.41566: done transferring module to remote 13830 1727204099.41579: _low_level_execute_command(): starting 13830 1727204099.41584: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257/ /root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257/AnsiballZ_service_facts.py && sleep 0' 13830 1727204099.43293: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204099.43299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204099.43302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204099.43305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204099.43628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204099.43632: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204099.43634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.43641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204099.43655: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204099.43658: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204099.43661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204099.43663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204099.43667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204099.43670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204099.43672: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204099.43674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.43677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204099.43679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204099.43681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204099.43879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204099.45888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204099.45918: stderr chunk (state=3): >>><<< 13830 1727204099.45922: stdout chunk (state=3): >>><<< 13830 1727204099.45943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204099.45946: _low_level_execute_command(): starting 13830 1727204099.45951: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257/AnsiballZ_service_facts.py && sleep 0' 13830 1727204099.47576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204099.47584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204099.47593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204099.47607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204099.47694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204099.47701: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204099.47710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.47722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204099.47772: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204099.47778: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204099.47786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204099.47794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204099.47805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204099.47812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204099.47818: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204099.47826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204099.48015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204099.48030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204099.48042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204099.48208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204100.83453: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 13830 1727204100.83488: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": 
{"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stop<<< 13830 1727204100.83492: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 13830 1727204100.83496: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name<<< 13830 1727204100.83500: stdout chunk (state=3): >>>": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed<<< 13830 1727204100.83507: stdout chunk (state=3): >>>.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": 
"systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13830 1727204100.84838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204100.84843: stdout chunk (state=3): >>><<< 13830 1727204100.84850: stderr chunk (state=3): >>><<< 13830 1727204100.84884: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204100.85886: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204100.85895: _low_level_execute_command(): starting 13830 1727204100.85900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204099.2974768-16026-172538011757257/ > /dev/null 2>&1 && sleep 0' 13830 1727204100.87152: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204100.87174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204100.87184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204100.87198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204100.87237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204100.87245: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204100.87255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204100.87283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204100.87290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204100.87297: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204100.87305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204100.87313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204100.87323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204100.87387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204100.87395: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204100.87409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204100.87484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204100.87504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204100.87511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204100.87588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204100.89433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204100.89437: stdout chunk (state=3): >>><<< 13830 1727204100.89440: stderr chunk (state=3): >>><<< 13830 1727204100.89442: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204100.89445: handler run complete 13830 1727204100.89625: variable 'ansible_facts' from source: unknown 13830 1727204100.89824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204100.90496: variable 'ansible_facts' from source: unknown 13830 1727204100.90843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204100.91235: attempt loop complete, returning result 13830 1727204100.91512: _execute() done 13830 1727204100.91532: dumping result to json 13830 1727204100.91605: done dumping result, returning 13830 1727204100.92709: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1659-6b02-0000000007ce] 13830 1727204100.93087: sending task result for task 0affcd87-79f5-1659-6b02-0000000007ce ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204100.94325: no more pending results, returning what we have 13830 1727204100.94329: results queue empty 13830 1727204100.94330: checking for any_errors_fatal 13830 1727204100.94335: done checking for any_errors_fatal 13830 1727204100.94336: checking for max_fail_percentage 13830 1727204100.94338: done checking for max_fail_percentage 13830 1727204100.94339: checking to see if all hosts have failed and the running result is not ok 13830 1727204100.94340: done checking to see if all hosts have failed 13830 1727204100.94341: getting the remaining hosts for this loop 13830 1727204100.94342: done getting the remaining hosts for this loop 13830 1727204100.94346: getting the next task for host managed-node3 13830 1727204100.94354: done getting next task for host managed-node3 13830 1727204100.94357: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13830 1727204100.94363: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204100.94384: getting variables 13830 1727204100.94385: in VariableManager get_vars() 13830 1727204100.94392: done sending task result for task 0affcd87-79f5-1659-6b02-0000000007ce 13830 1727204100.94432: Calling all_inventory to load vars for managed-node3 13830 1727204100.94440: Calling groups_inventory to load vars for managed-node3 13830 1727204100.94443: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204100.94454: Calling all_plugins_play to load vars for managed-node3 13830 1727204100.94458: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204100.94462: Calling groups_plugins_play to load vars for managed-node3 13830 1727204100.95048: WORKER PROCESS EXITING 13830 1727204100.97814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204101.01993: done with get_vars() 13830 1727204101.02031: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:01 -0400 (0:00:01.836) 0:00:34.100 ***** 13830 1727204101.02258: entering _queue_task() for managed-node3/package_facts 13830 1727204101.03137: worker is 1 (out of 1 available) 13830 1727204101.03151: exiting _queue_task() for managed-node3/package_facts 13830 1727204101.03166: done queuing things up, now waiting for results queue to drain 13830 1727204101.03168: waiting for pending results... 
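The package_facts task queued here follows the same low-level pattern the log shows for service_facts: create a remote temp directory, copy the AnsiballZ_package_facts.py payload over SFTP, chmod it, run it with the remote /usr/bin/python3.9, and (as with service_facts at the top of this excerpt) remove the temp directory once the module has returned. Its result, which starts to appear further down, maps each package name to a list of dicts with the keys name, version, release, epoch, arch and source. A minimal Python sketch, assuming that payload has been captured to a hypothetical file package_facts.json, rebuilds full package strings from those fields:

import json

# Hypothetical local capture of the package_facts JSON printed further down in this log.
with open("package_facts.json") as fh:
    payload = json.load(fh)

# package_facts returns ansible_facts.packages: name -> list of installed builds.
packages = payload["ansible_facts"]["packages"]

for name, builds in sorted(packages.items()):
    for pkg in builds:
        # A null epoch is simply omitted from the printed string.
        epoch = f"{pkg['epoch']}:" if pkg.get("epoch") else ""
        print(f"{name}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}")

For the entries shown below this yields lines such as libgcc-11.5.0-2.el9.x86_64 and gmp-1:6.2.0-13.el9.x86_64.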
13830 1727204101.04097: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13830 1727204101.04572: in run() - task 0affcd87-79f5-1659-6b02-0000000007cf 13830 1727204101.04576: variable 'ansible_search_path' from source: unknown 13830 1727204101.04579: variable 'ansible_search_path' from source: unknown 13830 1727204101.04663: calling self._execute() 13830 1727204101.04759: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204101.04765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204101.04896: variable 'omit' from source: magic vars 13830 1727204101.05590: variable 'ansible_distribution_major_version' from source: facts 13830 1727204101.05603: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204101.05607: variable 'omit' from source: magic vars 13830 1727204101.05812: variable 'omit' from source: magic vars 13830 1727204101.05843: variable 'omit' from source: magic vars 13830 1727204101.06003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204101.06038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204101.06059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204101.06200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204101.06210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204101.06240: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204101.06244: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204101.06247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204101.06469: Set connection var ansible_connection to ssh 13830 1727204101.06479: Set connection var ansible_timeout to 10 13830 1727204101.06485: Set connection var ansible_shell_executable to /bin/sh 13830 1727204101.06488: Set connection var ansible_shell_type to sh 13830 1727204101.06493: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204101.06618: Set connection var ansible_pipelining to False 13830 1727204101.06646: variable 'ansible_shell_executable' from source: unknown 13830 1727204101.06650: variable 'ansible_connection' from source: unknown 13830 1727204101.06653: variable 'ansible_module_compression' from source: unknown 13830 1727204101.06655: variable 'ansible_shell_type' from source: unknown 13830 1727204101.06657: variable 'ansible_shell_executable' from source: unknown 13830 1727204101.06659: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204101.06662: variable 'ansible_pipelining' from source: unknown 13830 1727204101.06668: variable 'ansible_timeout' from source: unknown 13830 1727204101.06673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204101.07102: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204101.07112: variable 'omit' from source: magic vars 13830 
1727204101.07117: starting attempt loop 13830 1727204101.07120: running the handler 13830 1727204101.07136: _low_level_execute_command(): starting 13830 1727204101.07142: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204101.09143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204101.09155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.09174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.09190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.09267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.09285: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204101.09295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.09348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204101.09358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204101.09367: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204101.09375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.09387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.09404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.09412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.09420: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204101.09432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.09507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204101.09641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204101.09645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204101.09742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204101.11414: stdout chunk (state=3): >>>/root <<< 13830 1727204101.11602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204101.11609: stdout chunk (state=3): >>><<< 13830 1727204101.11620: stderr chunk (state=3): >>><<< 13830 1727204101.11648: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204101.11663: _low_level_execute_command(): starting 13830 1727204101.11671: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415 `" && echo ansible-tmp-1727204101.1164746-16381-60199047286415="` echo /root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415 `" ) && sleep 0' 13830 1727204101.13248: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204101.13256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.13269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.13292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.13336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.13396: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204101.13405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.13420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204101.13428: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204101.13436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204101.13442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.13452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.13462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.13470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.13478: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204101.13487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.13562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204101.13730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204101.13741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204101.13899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204101.15916: stdout chunk (state=3): >>>ansible-tmp-1727204101.1164746-16381-60199047286415=/root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415 <<< 13830 1727204101.16102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204101.16105: stdout chunk (state=3): >>><<< 13830 1727204101.16121: stderr chunk (state=3): >>><<< 13830 1727204101.16138: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204101.1164746-16381-60199047286415=/root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204101.16189: variable 'ansible_module_compression' from source: unknown 13830 1727204101.16244: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13830 1727204101.16303: variable 'ansible_facts' from source: unknown 13830 1727204101.16461: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415/AnsiballZ_package_facts.py 13830 1727204101.17152: Sending initial data 13830 1727204101.17156: Sent initial data (161 bytes) 13830 1727204101.19486: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204101.19545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.19556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.19573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.19681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.19687: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204101.19697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.19710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204101.19718: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204101.19725: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204101.19734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.19743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.19758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.19868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.19877: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204101.19887: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.19960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204101.19982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204101.19987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204101.20192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204101.21920: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204101.21960: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204101.22000: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpbiv_b1ve /root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415/AnsiballZ_package_facts.py <<< 13830 1727204101.22041: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204101.25191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204101.25289: stderr chunk (state=3): >>><<< 13830 1727204101.25293: stdout chunk (state=3): >>><<< 13830 1727204101.25319: done transferring module to remote 13830 1727204101.25333: _low_level_execute_command(): starting 13830 1727204101.25337: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415/ /root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415/AnsiballZ_package_facts.py && sleep 0' 13830 1727204101.27053: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204101.27062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.27080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.27094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.27149: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.27191: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204101.27201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.27216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204101.27229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204101.27240: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204101.27248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.27258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.27282: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.27291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.27299: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204101.27316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.27483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204101.27499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204101.27502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204101.27642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204101.29416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204101.29420: stdout chunk (state=3): >>><<< 13830 1727204101.29425: stderr chunk (state=3): >>><<< 13830 1727204101.29451: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204101.29454: _low_level_execute_command(): starting 13830 1727204101.29457: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415/AnsiballZ_package_facts.py && sleep 0' 13830 1727204101.31184: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204101.31283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.31295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.32136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.32183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.32190: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204101.32200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.32217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204101.32340: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 
1727204101.32348: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204101.32356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.32369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.32381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.32389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.32396: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204101.32405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.32518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204101.32565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204101.32572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204101.32778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204101.79583: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": 
"2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": 
"libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.e<<< 13830 1727204101.79677: stdout chunk (state=3): >>>l9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", 
"release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", <<< 13830 1727204101.79686: stdout chunk (state=3): >>>"source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "<<< 13830 1727204101.79728: stdout chunk (state=3): >>>rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch":<<< 13830 1727204101.79786: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", 
"version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": 
"perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", 
"release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": 
"systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto-li<<< 13830 1727204101.79809: stdout chunk (state=3): >>>bev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13830 1727204101.81285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.15.87 closed. <<< 13830 1727204101.81344: stderr chunk (state=3): >>><<< 13830 1727204101.81348: stdout chunk (state=3): >>><<< 13830 1727204101.81397: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {...}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed.
13830 1727204101.86553: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204101.86726: _low_level_execute_command(): starting 13830 1727204101.86735: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204101.1164746-16381-60199047286415/ > /dev/null 2>&1 && sleep 0' 13830 1727204101.88516: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204101.88582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.88592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.88607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.88646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.88793: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204101.88804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.88855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204101.88859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204101.88862: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204101.88866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204101.88868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204101.88871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204101.88963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204101.88968: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204101.88971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204101.89143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204101.89150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204101.89157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204101.89279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204101.91184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204101.91223: stderr chunk (state=3): >>><<< 13830 1727204101.91227: stdout chunk (state=3): >>><<< 13830 1727204101.91242: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204101.91248: handler run complete 13830 1727204101.92656: variable 'ansible_facts' from source: unknown 13830 1727204101.93733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204102.00871: variable 'ansible_facts' from source: unknown 13830 1727204102.02749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204102.05036: attempt loop complete, returning result 13830 1727204102.05095: _execute() done 13830 1727204102.05099: dumping result to json 13830 1727204102.05945: done dumping result, returning 13830 1727204102.05956: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1659-6b02-0000000007cf] 13830 1727204102.05962: sending task result for task 0affcd87-79f5-1659-6b02-0000000007cf ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204102.11945: done sending task result for task 0affcd87-79f5-1659-6b02-0000000007cf 13830 1727204102.11952: WORKER PROCESS EXITING 13830 1727204102.11962: no more pending results, returning what we have 13830 1727204102.11968: results queue empty 13830 1727204102.11969: checking for any_errors_fatal 13830 1727204102.11976: done checking for any_errors_fatal 13830 1727204102.11976: checking for max_fail_percentage 13830 1727204102.11978: done checking for max_fail_percentage 13830 1727204102.11979: checking to see if all hosts have failed and the running result is not ok 13830 1727204102.11980: done checking to see if all hosts have failed 13830 1727204102.11981: getting the remaining hosts for this loop 13830 1727204102.11982: done getting the remaining hosts for this loop 13830 1727204102.11986: getting the next task for host managed-node3 13830 1727204102.11993: done getting next task for host managed-node3 13830 1727204102.11998: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13830 1727204102.12003: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204102.12017: getting variables 13830 1727204102.12018: in VariableManager get_vars() 13830 1727204102.12057: Calling all_inventory to load vars for managed-node3 13830 1727204102.12060: Calling groups_inventory to load vars for managed-node3 13830 1727204102.12108: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204102.12119: Calling all_plugins_play to load vars for managed-node3 13830 1727204102.12122: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204102.12147: Calling groups_plugins_play to load vars for managed-node3 13830 1727204102.16274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204102.21542: done with get_vars() 13830 1727204102.21615: done getting variables 13830 1727204102.21682: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:02 -0400 (0:00:01.196) 0:00:35.297 ***** 13830 1727204102.22029: entering _queue_task() for managed-node3/debug 13830 1727204102.22648: worker is 1 (out of 1 available) 13830 1727204102.22661: exiting _queue_task() for managed-node3/debug 13830 1727204102.22799: done queuing things up, now waiting for results queue to drain 13830 1727204102.22801: waiting for pending results... 
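The task queued at this point is the role's "Print network provider" debug (tasks/main.yml:7), whose result further down is "Using network provider: nm". A hedged sketch of what a debug task producing that message could look like; the exact wording inside the role may differ, and the when condition simply mirrors the conditional the log reports as evaluated True:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'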
13830 1727204102.23543: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 13830 1727204102.23892: in run() - task 0affcd87-79f5-1659-6b02-000000000694 13830 1727204102.23993: variable 'ansible_search_path' from source: unknown 13830 1727204102.24003: variable 'ansible_search_path' from source: unknown 13830 1727204102.24051: calling self._execute() 13830 1727204102.24437: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204102.24500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204102.24515: variable 'omit' from source: magic vars 13830 1727204102.26036: variable 'ansible_distribution_major_version' from source: facts 13830 1727204102.26715: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204102.26729: variable 'omit' from source: magic vars 13830 1727204102.26819: variable 'omit' from source: magic vars 13830 1727204102.26940: variable 'network_provider' from source: set_fact 13830 1727204102.26966: variable 'omit' from source: magic vars 13830 1727204102.27022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204102.27072: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204102.27100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204102.27123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204102.27144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204102.27183: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204102.27475: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204102.27483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204102.27590: Set connection var ansible_connection to ssh 13830 1727204102.27607: Set connection var ansible_timeout to 10 13830 1727204102.27618: Set connection var ansible_shell_executable to /bin/sh 13830 1727204102.27625: Set connection var ansible_shell_type to sh 13830 1727204102.27638: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204102.27653: Set connection var ansible_pipelining to False 13830 1727204102.27685: variable 'ansible_shell_executable' from source: unknown 13830 1727204102.27696: variable 'ansible_connection' from source: unknown 13830 1727204102.27703: variable 'ansible_module_compression' from source: unknown 13830 1727204102.27710: variable 'ansible_shell_type' from source: unknown 13830 1727204102.27717: variable 'ansible_shell_executable' from source: unknown 13830 1727204102.27723: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204102.27733: variable 'ansible_pipelining' from source: unknown 13830 1727204102.27740: variable 'ansible_timeout' from source: unknown 13830 1727204102.27748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204102.28007: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 13830 1727204102.28069: variable 'omit' from source: magic vars 13830 1727204102.28079: starting attempt loop 13830 1727204102.28086: running the handler 13830 1727204102.28215: handler run complete 13830 1727204102.28338: attempt loop complete, returning result 13830 1727204102.28348: _execute() done 13830 1727204102.28355: dumping result to json 13830 1727204102.28362: done dumping result, returning 13830 1727204102.28375: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1659-6b02-000000000694] 13830 1727204102.28385: sending task result for task 0affcd87-79f5-1659-6b02-000000000694 ok: [managed-node3] => {} MSG: Using network provider: nm 13830 1727204102.28551: done sending task result for task 0affcd87-79f5-1659-6b02-000000000694 13830 1727204102.28556: WORKER PROCESS EXITING 13830 1727204102.28571: no more pending results, returning what we have 13830 1727204102.28577: results queue empty 13830 1727204102.28578: checking for any_errors_fatal 13830 1727204102.28591: done checking for any_errors_fatal 13830 1727204102.28592: checking for max_fail_percentage 13830 1727204102.28594: done checking for max_fail_percentage 13830 1727204102.28596: checking to see if all hosts have failed and the running result is not ok 13830 1727204102.28596: done checking to see if all hosts have failed 13830 1727204102.28597: getting the remaining hosts for this loop 13830 1727204102.28599: done getting the remaining hosts for this loop 13830 1727204102.28603: getting the next task for host managed-node3 13830 1727204102.28611: done getting next task for host managed-node3 13830 1727204102.28615: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13830 1727204102.28620: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204102.28630: getting variables 13830 1727204102.28632: in VariableManager get_vars() 13830 1727204102.28672: Calling all_inventory to load vars for managed-node3 13830 1727204102.28675: Calling groups_inventory to load vars for managed-node3 13830 1727204102.28677: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204102.28686: Calling all_plugins_play to load vars for managed-node3 13830 1727204102.28688: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204102.28690: Calling groups_plugins_play to load vars for managed-node3 13830 1727204102.31184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204102.34958: done with get_vars() 13830 1727204102.34996: done getting variables 13830 1727204102.35188: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.132) 0:00:35.430 ***** 13830 1727204102.35237: entering _queue_task() for managed-node3/fail 13830 1727204102.36072: worker is 1 (out of 1 available) 13830 1727204102.36087: exiting _queue_task() for managed-node3/fail 13830 1727204102.36100: done queuing things up, now waiting for results queue to drain 13830 1727204102.36102: waiting for pending results... 
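The task queued next, "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" (tasks/main.yml:11), is a guard that skips below because network_state != {} evaluates to False. A sketch of the general shape of such a guard, assuming a fail module with a compound when; the failure message and the second condition on the provider are assumptions, not taken from this log:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: "network_state is not supported with the initscripts provider"   # illustrative message
  when:
    - network_state != {}                      # evaluated False in this run, so the task is skipped
    - network_provider == "initscripts"        # assumption; the role may express this check differently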
13830 1727204102.36887: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13830 1727204102.37297: in run() - task 0affcd87-79f5-1659-6b02-000000000695 13830 1727204102.37318: variable 'ansible_search_path' from source: unknown 13830 1727204102.37326: variable 'ansible_search_path' from source: unknown 13830 1727204102.37488: calling self._execute() 13830 1727204102.37702: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204102.37714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204102.37727: variable 'omit' from source: magic vars 13830 1727204102.38337: variable 'ansible_distribution_major_version' from source: facts 13830 1727204102.38471: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204102.38804: variable 'network_state' from source: role '' defaults 13830 1727204102.38821: Evaluated conditional (network_state != {}): False 13830 1727204102.38829: when evaluation is False, skipping this task 13830 1727204102.38839: _execute() done 13830 1727204102.38846: dumping result to json 13830 1727204102.38853: done dumping result, returning 13830 1727204102.38867: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1659-6b02-000000000695] 13830 1727204102.38879: sending task result for task 0affcd87-79f5-1659-6b02-000000000695 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204102.39045: no more pending results, returning what we have 13830 1727204102.39050: results queue empty 13830 1727204102.39051: checking for any_errors_fatal 13830 1727204102.39058: done checking for any_errors_fatal 13830 1727204102.39059: checking for max_fail_percentage 13830 1727204102.39061: done checking for max_fail_percentage 13830 1727204102.39062: checking to see if all hosts have failed and the running result is not ok 13830 1727204102.39062: done checking to see if all hosts have failed 13830 1727204102.39063: getting the remaining hosts for this loop 13830 1727204102.39066: done getting the remaining hosts for this loop 13830 1727204102.39071: getting the next task for host managed-node3 13830 1727204102.39081: done getting next task for host managed-node3 13830 1727204102.39086: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13830 1727204102.39092: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204102.39114: getting variables 13830 1727204102.39116: in VariableManager get_vars() 13830 1727204102.39158: Calling all_inventory to load vars for managed-node3 13830 1727204102.39161: Calling groups_inventory to load vars for managed-node3 13830 1727204102.39163: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204102.39178: Calling all_plugins_play to load vars for managed-node3 13830 1727204102.39180: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204102.39183: Calling groups_plugins_play to load vars for managed-node3 13830 1727204102.39852: done sending task result for task 0affcd87-79f5-1659-6b02-000000000695 13830 1727204102.39856: WORKER PROCESS EXITING 13830 1727204102.41928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204102.45641: done with get_vars() 13830 1727204102.45796: done getting variables 13830 1727204102.45861: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.107) 0:00:35.538 ***** 13830 1727204102.46026: entering _queue_task() for managed-node3/fail 13830 1727204102.46742: worker is 1 (out of 1 available) 13830 1727204102.46876: exiting _queue_task() for managed-node3/fail 13830 1727204102.46889: done queuing things up, now waiting for results queue to drain 13830 1727204102.46891: waiting for pending results... 
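Both of these abort checks skip only because network_state comes from the role defaults as an empty dict (the log notes variable 'network_state' from source: role '' defaults). A hypothetical invocation under which network_state != {} would be True and the guards would actually matter; the host, the Nmstate-style content, and the interface name are illustrative assumptions:

- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:                 # non-empty, so network_state != {} evaluates True
          interfaces:
            - name: eth1               # illustrative interface
              type: ethernet
              state: up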
13830 1727204102.48248: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13830 1727204102.48533: in run() - task 0affcd87-79f5-1659-6b02-000000000696 13830 1727204102.48599: variable 'ansible_search_path' from source: unknown 13830 1727204102.48696: variable 'ansible_search_path' from source: unknown 13830 1727204102.48742: calling self._execute() 13830 1727204102.48986: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204102.49089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204102.49103: variable 'omit' from source: magic vars 13830 1727204102.50509: variable 'ansible_distribution_major_version' from source: facts 13830 1727204102.50751: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204102.51307: variable 'network_state' from source: role '' defaults 13830 1727204102.51325: Evaluated conditional (network_state != {}): False 13830 1727204102.51335: when evaluation is False, skipping this task 13830 1727204102.51342: _execute() done 13830 1727204102.51349: dumping result to json 13830 1727204102.51356: done dumping result, returning 13830 1727204102.51369: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1659-6b02-000000000696] 13830 1727204102.51385: sending task result for task 0affcd87-79f5-1659-6b02-000000000696 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204102.51808: no more pending results, returning what we have 13830 1727204102.51813: results queue empty 13830 1727204102.51813: checking for any_errors_fatal 13830 1727204102.51820: done checking for any_errors_fatal 13830 1727204102.51821: checking for max_fail_percentage 13830 1727204102.51823: done checking for max_fail_percentage 13830 1727204102.51824: checking to see if all hosts have failed and the running result is not ok 13830 1727204102.51824: done checking to see if all hosts have failed 13830 1727204102.51825: getting the remaining hosts for this loop 13830 1727204102.51828: done getting the remaining hosts for this loop 13830 1727204102.51832: getting the next task for host managed-node3 13830 1727204102.51840: done getting next task for host managed-node3 13830 1727204102.51844: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13830 1727204102.51850: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204102.51869: done sending task result for task 0affcd87-79f5-1659-6b02-000000000696 13830 1727204102.51874: WORKER PROCESS EXITING 13830 1727204102.51922: getting variables 13830 1727204102.51925: in VariableManager get_vars() 13830 1727204102.51972: Calling all_inventory to load vars for managed-node3 13830 1727204102.51976: Calling groups_inventory to load vars for managed-node3 13830 1727204102.51978: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204102.51991: Calling all_plugins_play to load vars for managed-node3 13830 1727204102.51993: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204102.51995: Calling groups_plugins_play to load vars for managed-node3 13830 1727204102.56039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204102.63637: done with get_vars() 13830 1727204102.63681: done getting variables 13830 1727204102.63980: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.179) 0:00:35.718 ***** 13830 1727204102.64022: entering _queue_task() for managed-node3/fail 13830 1727204102.65279: worker is 1 (out of 1 available) 13830 1727204102.65295: exiting _queue_task() for managed-node3/fail 13830 1727204102.65307: done queuing things up, now waiting for results queue to drain 13830 1727204102.65309: waiting for pending results... 
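The teaming guard queued here (tasks/main.yml:25) skips further down because ansible_distribution_major_version | int > 9 is False on this EL9 host. A sketch of how such a guard might be expressed, again assuming a fail module; the message and the extra condition on team profiles being defined are assumptions, since the log never reaches them once the version check evaluates False:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: "Team interfaces are not supported on EL10 or later"   # illustrative message
  when:
    - ansible_distribution_major_version | int > 9               # False here (EL9), so skipped
    - __network_team_connections_defined                         # assumption about the role's internals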
13830 1727204102.65949: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13830 1727204102.67224: in run() - task 0affcd87-79f5-1659-6b02-000000000697 13830 1727204102.67258: variable 'ansible_search_path' from source: unknown 13830 1727204102.67359: variable 'ansible_search_path' from source: unknown 13830 1727204102.67406: calling self._execute() 13830 1727204102.67628: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204102.67643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204102.67657: variable 'omit' from source: magic vars 13830 1727204102.68474: variable 'ansible_distribution_major_version' from source: facts 13830 1727204102.68562: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204102.68988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204102.76368: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204102.76699: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204102.76756: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204102.76947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204102.76984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204102.77195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204102.77229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204102.77500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204102.77549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204102.77571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204102.77803: variable 'ansible_distribution_major_version' from source: facts 13830 1727204102.78156: Evaluated conditional (ansible_distribution_major_version | int > 9): False 13830 1727204102.78166: when evaluation is False, skipping this task 13830 1727204102.78173: _execute() done 13830 1727204102.78179: dumping result to json 13830 1727204102.78186: done dumping result, returning 13830 1727204102.78198: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1659-6b02-000000000697] 13830 1727204102.78209: sending task result for task 
0affcd87-79f5-1659-6b02-000000000697 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 13830 1727204102.78398: no more pending results, returning what we have 13830 1727204102.78405: results queue empty 13830 1727204102.78406: checking for any_errors_fatal 13830 1727204102.78414: done checking for any_errors_fatal 13830 1727204102.78415: checking for max_fail_percentage 13830 1727204102.78417: done checking for max_fail_percentage 13830 1727204102.78418: checking to see if all hosts have failed and the running result is not ok 13830 1727204102.78419: done checking to see if all hosts have failed 13830 1727204102.78420: getting the remaining hosts for this loop 13830 1727204102.78422: done getting the remaining hosts for this loop 13830 1727204102.78428: getting the next task for host managed-node3 13830 1727204102.78437: done getting next task for host managed-node3 13830 1727204102.78441: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13830 1727204102.78448: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204102.78464: done sending task result for task 0affcd87-79f5-1659-6b02-000000000697 13830 1727204102.78479: getting variables 13830 1727204102.78482: in VariableManager get_vars() 13830 1727204102.78524: Calling all_inventory to load vars for managed-node3 13830 1727204102.78529: Calling groups_inventory to load vars for managed-node3 13830 1727204102.78532: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204102.78544: Calling all_plugins_play to load vars for managed-node3 13830 1727204102.78547: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204102.78551: Calling groups_plugins_play to load vars for managed-node3 13830 1727204102.79120: WORKER PROCESS EXITING 13830 1727204102.81906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204102.86138: done with get_vars() 13830 1727204102.86178: done getting variables 13830 1727204102.86258: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.222) 0:00:35.941 ***** 13830 1727204102.86298: entering _queue_task() for managed-node3/dnf 13830 1727204102.86992: worker is 1 (out of 1 available) 13830 1727204102.87007: exiting _queue_task() for managed-node3/dnf 13830 1727204102.87026: done queuing things up, now waiting for results queue to drain 13830 1727204102.87028: waiting for pending results... 
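The DNF check queued here (tasks/main.yml:36) is skipped below because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for the configured profiles. A sketch of a task carrying the two conditionals the log shows being evaluated; the package list, update_only, and check_mode are assumptions about how "check if updates are available" might be implemented, not taken from the role:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name:
      - NetworkManager-wifi          # illustrative package list
      - NetworkManager-team
    state: latest
    update_only: true
  check_mode: true                   # dry run: report available updates without applying them
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined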
13830 1727204102.87395: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13830 1727204102.87575: in run() - task 0affcd87-79f5-1659-6b02-000000000698 13830 1727204102.87599: variable 'ansible_search_path' from source: unknown 13830 1727204102.87606: variable 'ansible_search_path' from source: unknown 13830 1727204102.87669: calling self._execute() 13830 1727204102.87826: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204102.87841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204102.87856: variable 'omit' from source: magic vars 13830 1727204102.88285: variable 'ansible_distribution_major_version' from source: facts 13830 1727204102.88306: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204102.88540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204102.91436: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204102.91537: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204102.91624: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204102.91685: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204102.91746: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204102.92351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204102.92390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204102.92428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204102.92483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204102.92510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204102.92651: variable 'ansible_distribution' from source: facts 13830 1727204102.92661: variable 'ansible_distribution_major_version' from source: facts 13830 1727204102.92683: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13830 1727204102.92819: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204102.92972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204102.93002: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204102.93030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204102.93084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204102.93103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204102.93153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204102.93189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204102.93219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204102.93383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204102.93407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204102.93453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204102.93524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204102.93633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204102.93724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204102.93748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204102.94105: variable 'network_connections' from source: task vars 13830 1727204102.94174: variable 'port2_profile' from source: play vars 13830 1727204102.94327: variable 'port2_profile' from source: play vars 13830 1727204102.94373: variable 'port1_profile' from source: play vars 13830 1727204102.94445: variable 'port1_profile' from source: play vars 13830 1727204102.94459: variable 'controller_profile' from source: play vars 
13830 1727204102.94529: variable 'controller_profile' from source: play vars 13830 1727204102.94617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204102.94832: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204102.94878: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204102.94921: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204102.94959: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204102.95017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204102.95054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204102.95082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204102.95106: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204102.95176: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204102.95845: variable 'network_connections' from source: task vars 13830 1727204102.95855: variable 'port2_profile' from source: play vars 13830 1727204102.95924: variable 'port2_profile' from source: play vars 13830 1727204102.95943: variable 'port1_profile' from source: play vars 13830 1727204102.96017: variable 'port1_profile' from source: play vars 13830 1727204102.96029: variable 'controller_profile' from source: play vars 13830 1727204102.96093: variable 'controller_profile' from source: play vars 13830 1727204102.96128: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204102.96138: when evaluation is False, skipping this task 13830 1727204102.96146: _execute() done 13830 1727204102.96153: dumping result to json 13830 1727204102.96160: done dumping result, returning 13830 1727204102.96175: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000698] 13830 1727204102.96186: sending task result for task 0affcd87-79f5-1659-6b02-000000000698 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204102.96371: no more pending results, returning what we have 13830 1727204102.96377: results queue empty 13830 1727204102.96378: checking for any_errors_fatal 13830 1727204102.96387: done checking for any_errors_fatal 13830 1727204102.96388: checking for max_fail_percentage 13830 1727204102.96389: done checking for max_fail_percentage 13830 1727204102.96390: checking to see if all hosts have failed and the running result is not ok 13830 
1727204102.96391: done checking to see if all hosts have failed 13830 1727204102.96392: getting the remaining hosts for this loop 13830 1727204102.96393: done getting the remaining hosts for this loop 13830 1727204102.96398: getting the next task for host managed-node3 13830 1727204102.96406: done getting next task for host managed-node3 13830 1727204102.96411: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13830 1727204102.96417: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204102.96439: getting variables 13830 1727204102.96441: in VariableManager get_vars() 13830 1727204102.96484: Calling all_inventory to load vars for managed-node3 13830 1727204102.96487: Calling groups_inventory to load vars for managed-node3 13830 1727204102.96489: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204102.96499: Calling all_plugins_play to load vars for managed-node3 13830 1727204102.96502: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204102.96504: Calling groups_plugins_play to load vars for managed-node3 13830 1727204102.97833: done sending task result for task 0affcd87-79f5-1659-6b02-000000000698 13830 1727204102.97837: WORKER PROCESS EXITING 13830 1727204102.99707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204103.05912: done with get_vars() 13830 1727204103.05951: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13830 1727204103.06038: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.199) 0:00:36.140 ***** 13830 1727204103.06279: entering _queue_task() for managed-node3/yum 13830 1727204103.06644: worker is 1 
(out of 1 available) 13830 1727204103.06657: exiting _queue_task() for managed-node3/yum 13830 1727204103.06671: done queuing things up, now waiting for results queue to drain 13830 1727204103.06673: waiting for pending results... 13830 1727204103.07675: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13830 1727204103.08289: in run() - task 0affcd87-79f5-1659-6b02-000000000699 13830 1727204103.08315: variable 'ansible_search_path' from source: unknown 13830 1727204103.08327: variable 'ansible_search_path' from source: unknown 13830 1727204103.08669: calling self._execute() 13830 1727204103.08939: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204103.08997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204103.09167: variable 'omit' from source: magic vars 13830 1727204103.09801: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.09829: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204103.10112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204103.17587: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204103.17784: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204103.17823: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204103.17976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204103.18001: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204103.18198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.18249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204103.18285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.18333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.18337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.18567: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.18971: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13830 1727204103.18974: when evaluation is False, skipping this task 13830 1727204103.18976: _execute() done 13830 1727204103.18977: dumping result to json 13830 1727204103.18979: done dumping result, returning 13830 1727204103.18981: done running TaskExecutor() for 
managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000699] 13830 1727204103.18983: sending task result for task 0affcd87-79f5-1659-6b02-000000000699 13830 1727204103.19059: done sending task result for task 0affcd87-79f5-1659-6b02-000000000699 13830 1727204103.19062: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13830 1727204103.19126: no more pending results, returning what we have 13830 1727204103.19133: results queue empty 13830 1727204103.19134: checking for any_errors_fatal 13830 1727204103.19165: done checking for any_errors_fatal 13830 1727204103.19167: checking for max_fail_percentage 13830 1727204103.19169: done checking for max_fail_percentage 13830 1727204103.19170: checking to see if all hosts have failed and the running result is not ok 13830 1727204103.19171: done checking to see if all hosts have failed 13830 1727204103.19171: getting the remaining hosts for this loop 13830 1727204103.19173: done getting the remaining hosts for this loop 13830 1727204103.19193: getting the next task for host managed-node3 13830 1727204103.19201: done getting next task for host managed-node3 13830 1727204103.19222: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13830 1727204103.19228: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204103.19281: getting variables 13830 1727204103.19284: in VariableManager get_vars() 13830 1727204103.19340: Calling all_inventory to load vars for managed-node3 13830 1727204103.19343: Calling groups_inventory to load vars for managed-node3 13830 1727204103.19346: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204103.19393: Calling all_plugins_play to load vars for managed-node3 13830 1727204103.19398: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204103.19402: Calling groups_plugins_play to load vars for managed-node3 13830 1727204103.22802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204103.24818: done with get_vars() 13830 1727204103.24857: done getting variables 13830 1727204103.24938: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.187) 0:00:36.328 ***** 13830 1727204103.25029: entering _queue_task() for managed-node3/fail 13830 1727204103.25492: worker is 1 (out of 1 available) 13830 1727204103.25504: exiting _queue_task() for managed-node3/fail 13830 1727204103.25518: done queuing things up, now waiting for results queue to drain 13830 1727204103.25519: waiting for pending results... 
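The fail task queued above (tasks/main.yml:60) is evaluated in the records that follow and ends up skipped, because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the bond/port profiles used in this run. A minimal, self-contained sketch of a task that reproduces this gating is shown below; it is not the role's actual source. Only the task name and the when-condition are taken from the log, while the play scaffolding, variable values, and message text are assumptions.

# Hypothetical sketch -- not the actual tasks/main.yml:60 of
# fedora.linux_system_roles.network.  Only the task name and the when-condition
# are taken from the log above; everything else is illustrative.
- hosts: localhost
  gather_facts: false
  vars:
    # In the role these flags are derived from network_connections; here they are
    # simply set to the values this run effectively sees.
    __network_wireless_connections_defined: false
    __network_team_connections_defined: false
  tasks:
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: "Wireless or team interfaces require NetworkManager to be restarted."  # invented wording
      when: __network_wireless_connections_defined or __network_team_connections_defined

Running this sketch skips the task with the same false_condition string that appears in the skip result further down.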
13830 1727204103.25935: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13830 1727204103.26115: in run() - task 0affcd87-79f5-1659-6b02-00000000069a 13830 1727204103.26178: variable 'ansible_search_path' from source: unknown 13830 1727204103.26190: variable 'ansible_search_path' from source: unknown 13830 1727204103.26258: calling self._execute() 13830 1727204103.26405: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204103.26438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204103.26453: variable 'omit' from source: magic vars 13830 1727204103.26992: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.27011: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204103.27241: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204103.27555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204103.31702: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204103.31777: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204103.31816: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204103.42253: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204103.42289: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204103.42385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.42414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204103.42455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.42502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.42516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.42590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.42614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204103.42638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.42696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.42710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.42755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.42797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204103.42821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.42866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.42891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.43076: variable 'network_connections' from source: task vars 13830 1727204103.43095: variable 'port2_profile' from source: play vars 13830 1727204103.43199: variable 'port2_profile' from source: play vars 13830 1727204103.43204: variable 'port1_profile' from source: play vars 13830 1727204103.43285: variable 'port1_profile' from source: play vars 13830 1727204103.43293: variable 'controller_profile' from source: play vars 13830 1727204103.43381: variable 'controller_profile' from source: play vars 13830 1727204103.43732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204103.43737: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204103.43739: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204103.43741: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204103.43760: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204103.44057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204103.44061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204103.44063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.44067: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204103.44070: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204103.44480: variable 'network_connections' from source: task vars 13830 1727204103.44483: variable 'port2_profile' from source: play vars 13830 1727204103.44485: variable 'port2_profile' from source: play vars 13830 1727204103.44487: variable 'port1_profile' from source: play vars 13830 1727204103.44489: variable 'port1_profile' from source: play vars 13830 1727204103.44491: variable 'controller_profile' from source: play vars 13830 1727204103.44493: variable 'controller_profile' from source: play vars 13830 1727204103.44495: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204103.44504: when evaluation is False, skipping this task 13830 1727204103.44506: _execute() done 13830 1727204103.44509: dumping result to json 13830 1727204103.44511: done dumping result, returning 13830 1727204103.44512: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-00000000069a] 13830 1727204103.44519: sending task result for task 0affcd87-79f5-1659-6b02-00000000069a 13830 1727204103.44588: done sending task result for task 0affcd87-79f5-1659-6b02-00000000069a 13830 1727204103.44591: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204103.44640: no more pending results, returning what we have 13830 1727204103.44644: results queue empty 13830 1727204103.44644: checking for any_errors_fatal 13830 1727204103.44650: done checking for any_errors_fatal 13830 1727204103.44651: checking for max_fail_percentage 13830 1727204103.44652: done checking for max_fail_percentage 13830 1727204103.44653: checking to see if all hosts have failed and the running result is not ok 13830 1727204103.44654: done checking to see if all hosts have failed 13830 1727204103.44655: getting the remaining hosts for this loop 13830 1727204103.44656: done getting the remaining hosts for this loop 13830 1727204103.44660: getting the next task for host managed-node3 13830 1727204103.44668: done getting next task for host managed-node3 13830 1727204103.44672: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13830 1727204103.44677: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204103.44694: getting variables 13830 1727204103.44695: in VariableManager get_vars() 13830 1727204103.44738: Calling all_inventory to load vars for managed-node3 13830 1727204103.44741: Calling groups_inventory to load vars for managed-node3 13830 1727204103.44743: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204103.44752: Calling all_plugins_play to load vars for managed-node3 13830 1727204103.44754: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204103.44756: Calling groups_plugins_play to load vars for managed-node3 13830 1727204103.57615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204103.59276: done with get_vars() 13830 1727204103.59315: done getting variables 13830 1727204103.59369: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.343) 0:00:36.672 ***** 13830 1727204103.59405: entering _queue_task() for managed-node3/package 13830 1727204103.60079: worker is 1 (out of 1 available) 13830 1727204103.60093: exiting _queue_task() for managed-node3/package 13830 1727204103.60106: done queuing things up, now waiting for results queue to drain 13830 1727204103.60108: waiting for pending results... 
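The Install packages task queued above (tasks/main.yml:73) is skipped further down because every entry of network_packages is already present in the gathered package facts: the recorded false_condition is "not network_packages is subset(ansible_facts.packages.keys())". A self-contained sketch of that install-only-if-missing pattern follows; the when-condition is copied from the log, while the one-package list, the host pattern, and the scaffolding are assumptions (the real list comes from the role defaults).

# Hypothetical sketch of the pattern, not the role's actual tasks/main.yml:73.
# The when-condition is copied from the log; the package list, host pattern and
# scaffolding are assumptions.
- hosts: managed-node3
  vars:
    network_packages:
      - NetworkManager            # assumed list; the role builds its own
  tasks:
    - name: Gather installed-package facts so ansible_facts.packages is defined
      ansible.builtin.package_facts:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

Because the condition is evaluated against ansible_facts.packages, no package-manager transaction is attempted on hosts where everything is already installed, which is exactly what the skip result below reports.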
13830 1727204103.60416: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 13830 1727204103.60574: in run() - task 0affcd87-79f5-1659-6b02-00000000069b 13830 1727204103.60586: variable 'ansible_search_path' from source: unknown 13830 1727204103.60591: variable 'ansible_search_path' from source: unknown 13830 1727204103.60633: calling self._execute() 13830 1727204103.60734: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204103.60738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204103.60746: variable 'omit' from source: magic vars 13830 1727204103.61124: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.61138: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204103.61356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204103.61718: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204103.61838: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204103.62035: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204103.62073: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204103.62367: variable 'network_packages' from source: role '' defaults 13830 1727204103.62618: variable '__network_provider_setup' from source: role '' defaults 13830 1727204103.62656: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204103.62710: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204103.62718: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204103.62788: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204103.62987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204103.65278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204103.65374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204103.65378: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204103.65397: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204103.65426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204103.65506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.65537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204103.65562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.65602: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.65620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.65668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.65688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204103.65710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.65753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.65835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.66337: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13830 1727204103.66571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.66710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204103.66734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.66780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.66794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.67108: variable 'ansible_python' from source: facts 13830 1727204103.67133: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13830 1727204103.67259: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204103.67340: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204103.67473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.67496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 13830 1727204103.67524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.67565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.67583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.67633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204103.67654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204103.67680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.67719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204103.67737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204103.67916: variable 'network_connections' from source: task vars 13830 1727204103.67920: variable 'port2_profile' from source: play vars 13830 1727204103.68074: variable 'port2_profile' from source: play vars 13830 1727204103.68077: variable 'port1_profile' from source: play vars 13830 1727204103.68184: variable 'port1_profile' from source: play vars 13830 1727204103.68187: variable 'controller_profile' from source: play vars 13830 1727204103.68245: variable 'controller_profile' from source: play vars 13830 1727204103.68321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204103.68348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204103.68385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204103.68416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204103.68467: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204103.68770: variable 'network_connections' from source: task vars 13830 1727204103.68774: variable 'port2_profile' from source: play vars 13830 1727204103.68877: variable 'port2_profile' from source: play vars 13830 
1727204103.68887: variable 'port1_profile' from source: play vars 13830 1727204103.68986: variable 'port1_profile' from source: play vars 13830 1727204103.68996: variable 'controller_profile' from source: play vars 13830 1727204103.69099: variable 'controller_profile' from source: play vars 13830 1727204103.69129: variable '__network_packages_default_wireless' from source: role '' defaults 13830 1727204103.69206: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204103.69524: variable 'network_connections' from source: task vars 13830 1727204103.69527: variable 'port2_profile' from source: play vars 13830 1727204103.69602: variable 'port2_profile' from source: play vars 13830 1727204103.69609: variable 'port1_profile' from source: play vars 13830 1727204103.69676: variable 'port1_profile' from source: play vars 13830 1727204103.69690: variable 'controller_profile' from source: play vars 13830 1727204103.69752: variable 'controller_profile' from source: play vars 13830 1727204103.69779: variable '__network_packages_default_team' from source: role '' defaults 13830 1727204103.69865: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204103.70474: variable 'network_connections' from source: task vars 13830 1727204103.70478: variable 'port2_profile' from source: play vars 13830 1727204103.70480: variable 'port2_profile' from source: play vars 13830 1727204103.70482: variable 'port1_profile' from source: play vars 13830 1727204103.70485: variable 'port1_profile' from source: play vars 13830 1727204103.70487: variable 'controller_profile' from source: play vars 13830 1727204103.70510: variable 'controller_profile' from source: play vars 13830 1727204103.70558: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204103.70618: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204103.70624: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204103.70692: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204103.71686: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13830 1727204103.72795: variable 'network_connections' from source: task vars 13830 1727204103.72919: variable 'port2_profile' from source: play vars 13830 1727204103.72985: variable 'port2_profile' from source: play vars 13830 1727204103.72994: variable 'port1_profile' from source: play vars 13830 1727204103.73168: variable 'port1_profile' from source: play vars 13830 1727204103.73248: variable 'controller_profile' from source: play vars 13830 1727204103.73307: variable 'controller_profile' from source: play vars 13830 1727204103.73316: variable 'ansible_distribution' from source: facts 13830 1727204103.73319: variable '__network_rh_distros' from source: role '' defaults 13830 1727204103.73326: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.73456: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13830 1727204103.73739: variable 'ansible_distribution' from source: facts 13830 1727204103.73743: variable '__network_rh_distros' from source: role '' defaults 13830 1727204103.73748: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.73762: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13830 1727204103.74192: 
variable 'ansible_distribution' from source: facts 13830 1727204103.74195: variable '__network_rh_distros' from source: role '' defaults 13830 1727204103.74202: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.74355: variable 'network_provider' from source: set_fact 13830 1727204103.74373: variable 'ansible_facts' from source: unknown 13830 1727204103.75921: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13830 1727204103.75925: when evaluation is False, skipping this task 13830 1727204103.75928: _execute() done 13830 1727204103.75933: dumping result to json 13830 1727204103.75936: done dumping result, returning 13830 1727204103.75939: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1659-6b02-00000000069b] 13830 1727204103.75941: sending task result for task 0affcd87-79f5-1659-6b02-00000000069b skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13830 1727204103.76220: no more pending results, returning what we have 13830 1727204103.76224: results queue empty 13830 1727204103.76225: checking for any_errors_fatal 13830 1727204103.76234: done checking for any_errors_fatal 13830 1727204103.76235: checking for max_fail_percentage 13830 1727204103.76237: done checking for max_fail_percentage 13830 1727204103.76238: checking to see if all hosts have failed and the running result is not ok 13830 1727204103.76238: done checking to see if all hosts have failed 13830 1727204103.76239: getting the remaining hosts for this loop 13830 1727204103.76241: done getting the remaining hosts for this loop 13830 1727204103.76246: getting the next task for host managed-node3 13830 1727204103.76254: done getting next task for host managed-node3 13830 1727204103.76262: done sending task result for task 0affcd87-79f5-1659-6b02-00000000069b 13830 1727204103.76270: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13830 1727204103.76277: WORKER PROCESS EXITING 13830 1727204103.76285: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204103.76310: getting variables 13830 1727204103.76312: in VariableManager get_vars() 13830 1727204103.76351: Calling all_inventory to load vars for managed-node3 13830 1727204103.76354: Calling groups_inventory to load vars for managed-node3 13830 1727204103.76356: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204103.76368: Calling all_plugins_play to load vars for managed-node3 13830 1727204103.76371: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204103.76374: Calling groups_plugins_play to load vars for managed-node3 13830 1727204103.79343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204103.83729: done with get_vars() 13830 1727204103.83868: done getting variables 13830 1727204103.83970: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.246) 0:00:36.918 ***** 13830 1727204103.84011: entering _queue_task() for managed-node3/package 13830 1727204103.84972: worker is 1 (out of 1 available) 13830 1727204103.84987: exiting _queue_task() for managed-node3/package 13830 1727204103.85000: done queuing things up, now waiting for results queue to drain 13830 1727204103.85002: waiting for pending results... 
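The task queued above (tasks/main.yml:85) only runs when the caller drives the role through the network_state variable; in this run network_state is the empty role default, so the condition network_state != {} is false and the task is skipped below. A hedged sketch of such a conditional install, with invented package names and scaffolding, could look like this:

# Hypothetical sketch, not the role's actual tasks/main.yml:85.  The task name and
# the when-condition come from the log; package names and scaffolding are assumptions.
- hosts: managed-node3
  vars:
    network_state: {}             # role default in this run; callers may override it
  tasks:
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager        # assumed package names
          - nmstate
        state: present
      when: network_state != {}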
13830 1727204103.85983: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13830 1727204103.86691: in run() - task 0affcd87-79f5-1659-6b02-00000000069c 13830 1727204103.86713: variable 'ansible_search_path' from source: unknown 13830 1727204103.86717: variable 'ansible_search_path' from source: unknown 13830 1727204103.86749: calling self._execute() 13830 1727204103.86962: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204103.86969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204103.87013: variable 'omit' from source: magic vars 13830 1727204103.88013: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.88026: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204103.88274: variable 'network_state' from source: role '' defaults 13830 1727204103.88284: Evaluated conditional (network_state != {}): False 13830 1727204103.88288: when evaluation is False, skipping this task 13830 1727204103.88291: _execute() done 13830 1727204103.88293: dumping result to json 13830 1727204103.88295: done dumping result, returning 13830 1727204103.88422: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1659-6b02-00000000069c] 13830 1727204103.88437: sending task result for task 0affcd87-79f5-1659-6b02-00000000069c skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204103.88605: no more pending results, returning what we have 13830 1727204103.88611: results queue empty 13830 1727204103.88612: checking for any_errors_fatal 13830 1727204103.88618: done checking for any_errors_fatal 13830 1727204103.88619: checking for max_fail_percentage 13830 1727204103.88621: done checking for max_fail_percentage 13830 1727204103.88622: checking to see if all hosts have failed and the running result is not ok 13830 1727204103.88623: done checking to see if all hosts have failed 13830 1727204103.88624: getting the remaining hosts for this loop 13830 1727204103.88626: done getting the remaining hosts for this loop 13830 1727204103.88631: getting the next task for host managed-node3 13830 1727204103.88639: done getting next task for host managed-node3 13830 1727204103.88644: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13830 1727204103.88651: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204103.88670: done sending task result for task 0affcd87-79f5-1659-6b02-00000000069c 13830 1727204103.88685: getting variables 13830 1727204103.88688: in VariableManager get_vars() 13830 1727204103.88732: Calling all_inventory to load vars for managed-node3 13830 1727204103.88736: Calling groups_inventory to load vars for managed-node3 13830 1727204103.88738: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204103.88753: Calling all_plugins_play to load vars for managed-node3 13830 1727204103.88756: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204103.88761: Calling groups_plugins_play to load vars for managed-node3 13830 1727204103.89804: WORKER PROCESS EXITING 13830 1727204103.92031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204103.95399: done with get_vars() 13830 1727204103.95436: done getting variables 13830 1727204103.95501: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.115) 0:00:37.033 ***** 13830 1727204103.95544: entering _queue_task() for managed-node3/package 13830 1727204103.96612: worker is 1 (out of 1 available) 13830 1727204103.96627: exiting _queue_task() for managed-node3/package 13830 1727204103.96640: done queuing things up, now waiting for results queue to drain 13830 1727204103.96642: waiting for pending results... 
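The python3-libnmstate install queued above (tasks/main.yml:96) is gated on the same network_state != {} condition and is skipped for the same reason: this play configures everything through network_connections (the controller and port profiles that keep appearing in the variable dumps), never through network_state. For contrast, a call that did set network_state would flip both nmstate-related install conditions to true; the interface data below is purely illustrative and not something from this run.

# Illustrative only: a non-empty network_state that would make the
# network_state != {} conditions evaluate to true.  The interface data is invented.
- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1          # invented example interface
              type: ethernet
              state: up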
13830 1727204103.97452: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13830 1727204103.97714: in run() - task 0affcd87-79f5-1659-6b02-00000000069d 13830 1727204103.97728: variable 'ansible_search_path' from source: unknown 13830 1727204103.97735: variable 'ansible_search_path' from source: unknown 13830 1727204103.97884: calling self._execute() 13830 1727204103.98091: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204103.98097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204103.98106: variable 'omit' from source: magic vars 13830 1727204103.99170: variable 'ansible_distribution_major_version' from source: facts 13830 1727204103.99175: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204103.99241: variable 'network_state' from source: role '' defaults 13830 1727204103.99252: Evaluated conditional (network_state != {}): False 13830 1727204103.99255: when evaluation is False, skipping this task 13830 1727204103.99258: _execute() done 13830 1727204103.99261: dumping result to json 13830 1727204103.99373: done dumping result, returning 13830 1727204103.99386: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1659-6b02-00000000069d] 13830 1727204103.99389: sending task result for task 0affcd87-79f5-1659-6b02-00000000069d 13830 1727204103.99612: done sending task result for task 0affcd87-79f5-1659-6b02-00000000069d 13830 1727204103.99617: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204104.00516: no more pending results, returning what we have 13830 1727204104.00521: results queue empty 13830 1727204104.00521: checking for any_errors_fatal 13830 1727204104.00530: done checking for any_errors_fatal 13830 1727204104.00531: checking for max_fail_percentage 13830 1727204104.00533: done checking for max_fail_percentage 13830 1727204104.00534: checking to see if all hosts have failed and the running result is not ok 13830 1727204104.00534: done checking to see if all hosts have failed 13830 1727204104.00535: getting the remaining hosts for this loop 13830 1727204104.00537: done getting the remaining hosts for this loop 13830 1727204104.00544: getting the next task for host managed-node3 13830 1727204104.00553: done getting next task for host managed-node3 13830 1727204104.00558: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13830 1727204104.00578: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204104.00600: getting variables 13830 1727204104.00602: in VariableManager get_vars() 13830 1727204104.00646: Calling all_inventory to load vars for managed-node3 13830 1727204104.00649: Calling groups_inventory to load vars for managed-node3 13830 1727204104.00652: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204104.00670: Calling all_plugins_play to load vars for managed-node3 13830 1727204104.00676: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204104.00680: Calling groups_plugins_play to load vars for managed-node3 13830 1727204104.03330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204104.07202: done with get_vars() 13830 1727204104.07238: done getting variables 13830 1727204104.07305: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.117) 0:00:37.151 ***** 13830 1727204104.07344: entering _queue_task() for managed-node3/service 13830 1727204104.08796: worker is 1 (out of 1 available) 13830 1727204104.08812: exiting _queue_task() for managed-node3/service 13830 1727204104.08827: done queuing things up, now waiting for results queue to drain 13830 1727204104.08829: waiting for pending results... 
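The restart task queued above (tasks/main.yml:109) is the service counterpart of the earlier consent check: it only fires when wireless or team connections are defined, and it is skipped below with the same false_condition. The role derives those two flags internally; one hedged way to express an equivalent check directly over a network_connections list is sketched here. The connection shapes and the filter expression are assumptions, not the role's real derivation; only the service action and the intent come from the log.

# Hypothetical sketch -- not the role's actual tasks/main.yml:109 and not its
# real flag derivation.  The connection data and the filter expression are
# assumptions; the service action and intent come from the log.
- hosts: managed-node3
  vars:
    network_connections:          # invented stand-ins for this run's bond profiles
      - name: bond0.0
        type: ethernet
      - name: bond0
        type: bond
  tasks:
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: >-
        network_connections | map(attribute='type') | list
        | intersect(['wireless', 'team']) | length > 0

With the ethernet/bond profiles used in this run the expression is false, matching the skip result that follows.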
13830 1727204104.09488: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13830 1727204104.09874: in run() - task 0affcd87-79f5-1659-6b02-00000000069e 13830 1727204104.09892: variable 'ansible_search_path' from source: unknown 13830 1727204104.09898: variable 'ansible_search_path' from source: unknown 13830 1727204104.09935: calling self._execute() 13830 1727204104.10151: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204104.10272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204104.10286: variable 'omit' from source: magic vars 13830 1727204104.11236: variable 'ansible_distribution_major_version' from source: facts 13830 1727204104.11361: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204104.11607: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204104.12038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204104.16425: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204104.16627: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204104.16777: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204104.16812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204104.16837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204104.17036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204104.17063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204104.17274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204104.17277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204104.17473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204104.17477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204104.17479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204104.17482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13830 1727204104.17490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204104.17505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204104.17677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204104.17700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204104.17725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204104.17903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204104.17916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204104.18128: variable 'network_connections' from source: task vars 13830 1727204104.18144: variable 'port2_profile' from source: play vars 13830 1727204104.18219: variable 'port2_profile' from source: play vars 13830 1727204104.18229: variable 'port1_profile' from source: play vars 13830 1727204104.18289: variable 'port1_profile' from source: play vars 13830 1727204104.18296: variable 'controller_profile' from source: play vars 13830 1727204104.18358: variable 'controller_profile' from source: play vars 13830 1727204104.18427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204104.18689: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204104.18693: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204104.18695: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204104.18718: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204104.18760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204104.18782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204104.18805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204104.18835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204104.18911: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204104.19145: variable 'network_connections' from source: task vars 13830 1727204104.19154: variable 'port2_profile' from source: play vars 13830 1727204104.19211: variable 'port2_profile' from source: play vars 13830 1727204104.19219: variable 'port1_profile' from source: play vars 13830 1727204104.19279: variable 'port1_profile' from source: play vars 13830 1727204104.19287: variable 'controller_profile' from source: play vars 13830 1727204104.19344: variable 'controller_profile' from source: play vars 13830 1727204104.19377: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204104.19388: when evaluation is False, skipping this task 13830 1727204104.19390: _execute() done 13830 1727204104.19393: dumping result to json 13830 1727204104.19396: done dumping result, returning 13830 1727204104.19398: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-00000000069e] 13830 1727204104.19401: sending task result for task 0affcd87-79f5-1659-6b02-00000000069e skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204104.19574: no more pending results, returning what we have 13830 1727204104.19578: results queue empty 13830 1727204104.19579: checking for any_errors_fatal 13830 1727204104.19587: done checking for any_errors_fatal 13830 1727204104.19588: checking for max_fail_percentage 13830 1727204104.19589: done checking for max_fail_percentage 13830 1727204104.19590: checking to see if all hosts have failed and the running result is not ok 13830 1727204104.19591: done checking to see if all hosts have failed 13830 1727204104.19592: getting the remaining hosts for this loop 13830 1727204104.19594: done getting the remaining hosts for this loop 13830 1727204104.19598: getting the next task for host managed-node3 13830 1727204104.19607: done getting next task for host managed-node3 13830 1727204104.19612: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13830 1727204104.19618: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13830 1727204104.19639: getting variables 13830 1727204104.19641: in VariableManager get_vars() 13830 1727204104.19682: Calling all_inventory to load vars for managed-node3 13830 1727204104.19686: Calling groups_inventory to load vars for managed-node3 13830 1727204104.19688: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204104.19700: Calling all_plugins_play to load vars for managed-node3 13830 1727204104.19703: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204104.19707: Calling groups_plugins_play to load vars for managed-node3 13830 1727204104.20899: done sending task result for task 0affcd87-79f5-1659-6b02-00000000069e 13830 1727204104.20904: WORKER PROCESS EXITING 13830 1727204104.22115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204104.24659: done with get_vars() 13830 1727204104.24691: done getting variables 13830 1727204104.24761: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.174) 0:00:37.326 ***** 13830 1727204104.24798: entering _queue_task() for managed-node3/service 13830 1727204104.25148: worker is 1 (out of 1 available) 13830 1727204104.25168: exiting _queue_task() for managed-node3/service 13830 1727204104.25181: done queuing things up, now waiting for results queue to drain 13830 1727204104.25183: waiting for pending results... 
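The entries above show the role skipping the "Restart NetworkManager due to wireless or team interfaces" task because its when expression (__network_wireless_connections_defined or __network_team_connections_defined) evaluated to False, then queuing the "Enable and start NetworkManager" task from tasks/main.yml:122. A minimal sketch of how such a guarded restart is typically written in an Ansible role follows; the task name and the condition are taken directly from the log, while the module body (service with state: restarted) is an assumption and may differ from the actual task in fedora.linux_system_roles.network:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:        # hypothetical body; only the name and condition come from the log above
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

When the when expression is false, the executor never contacts the host; it returns the skipping result shown above, with "false_condition" naming the expression and "skip_reason": "Conditional result was False".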
13830 1727204104.25547: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13830 1727204104.25692: in run() - task 0affcd87-79f5-1659-6b02-00000000069f 13830 1727204104.25716: variable 'ansible_search_path' from source: unknown 13830 1727204104.25721: variable 'ansible_search_path' from source: unknown 13830 1727204104.25769: calling self._execute() 13830 1727204104.25870: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204104.25874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204104.25877: variable 'omit' from source: magic vars 13830 1727204104.26300: variable 'ansible_distribution_major_version' from source: facts 13830 1727204104.26342: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204104.26528: variable 'network_provider' from source: set_fact 13830 1727204104.26534: variable 'network_state' from source: role '' defaults 13830 1727204104.26537: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13830 1727204104.26539: variable 'omit' from source: magic vars 13830 1727204104.26645: variable 'omit' from source: magic vars 13830 1727204104.26662: variable 'network_service_name' from source: role '' defaults 13830 1727204104.26766: variable 'network_service_name' from source: role '' defaults 13830 1727204104.27091: variable '__network_provider_setup' from source: role '' defaults 13830 1727204104.27095: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204104.27168: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204104.27183: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204104.27247: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204104.27653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204104.31350: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204104.31934: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204104.31995: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204104.32040: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204104.32077: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204104.32180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204104.32224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204104.32259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204104.32317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 13830 1727204104.32340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204104.32394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204104.32435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204104.32468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204104.32515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204104.32545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204104.33666: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13830 1727204104.34112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204104.34152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204104.34187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204104.34234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204104.34257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204104.34707: variable 'ansible_python' from source: facts 13830 1727204104.34737: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13830 1727204104.35604: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204104.35745: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204104.35948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204104.35982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204104.36326: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204104.36469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204104.36491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204104.36544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204104.36585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204104.36618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204104.36673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204104.36691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204104.37254: variable 'network_connections' from source: task vars 13830 1727204104.37270: variable 'port2_profile' from source: play vars 13830 1727204104.37403: variable 'port2_profile' from source: play vars 13830 1727204104.37423: variable 'port1_profile' from source: play vars 13830 1727204104.37555: variable 'port1_profile' from source: play vars 13830 1727204104.37576: variable 'controller_profile' from source: play vars 13830 1727204104.37669: variable 'controller_profile' from source: play vars 13830 1727204104.37807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204104.38035: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204104.38127: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204104.38190: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204104.38246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204104.38359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204104.38402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204104.38449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 13830 1727204104.38498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204104.38562: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204104.38899: variable 'network_connections' from source: task vars 13830 1727204104.38912: variable 'port2_profile' from source: play vars 13830 1727204104.39076: variable 'port2_profile' from source: play vars 13830 1727204104.39122: variable 'port1_profile' from source: play vars 13830 1727204104.39395: variable 'port1_profile' from source: play vars 13830 1727204104.39413: variable 'controller_profile' from source: play vars 13830 1727204104.39502: variable 'controller_profile' from source: play vars 13830 1727204104.39578: variable '__network_packages_default_wireless' from source: role '' defaults 13830 1727204104.39671: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204104.40078: variable 'network_connections' from source: task vars 13830 1727204104.40087: variable 'port2_profile' from source: play vars 13830 1727204104.40170: variable 'port2_profile' from source: play vars 13830 1727204104.40183: variable 'port1_profile' from source: play vars 13830 1727204104.40267: variable 'port1_profile' from source: play vars 13830 1727204104.40281: variable 'controller_profile' from source: play vars 13830 1727204104.40361: variable 'controller_profile' from source: play vars 13830 1727204104.40394: variable '__network_packages_default_team' from source: role '' defaults 13830 1727204104.40514: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204104.40842: variable 'network_connections' from source: task vars 13830 1727204104.40852: variable 'port2_profile' from source: play vars 13830 1727204104.40939: variable 'port2_profile' from source: play vars 13830 1727204104.40952: variable 'port1_profile' from source: play vars 13830 1727204104.41069: variable 'port1_profile' from source: play vars 13830 1727204104.41112: variable 'controller_profile' from source: play vars 13830 1727204104.41304: variable 'controller_profile' from source: play vars 13830 1727204104.41404: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204104.41623: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204104.41638: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204104.41710: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204104.41994: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13830 1727204104.43002: variable 'network_connections' from source: task vars 13830 1727204104.43019: variable 'port2_profile' from source: play vars 13830 1727204104.43195: variable 'port2_profile' from source: play vars 13830 1727204104.43213: variable 'port1_profile' from source: play vars 13830 1727204104.43281: variable 'port1_profile' from source: play vars 13830 1727204104.43293: variable 'controller_profile' from source: play vars 13830 1727204104.43437: variable 'controller_profile' from source: play vars 13830 1727204104.43617: variable 'ansible_distribution' from source: facts 13830 1727204104.43628: variable '__network_rh_distros' from source: role '' defaults 13830 1727204104.43641: variable 
'ansible_distribution_major_version' from source: facts 13830 1727204104.43666: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13830 1727204104.44121: variable 'ansible_distribution' from source: facts 13830 1727204104.44134: variable '__network_rh_distros' from source: role '' defaults 13830 1727204104.44149: variable 'ansible_distribution_major_version' from source: facts 13830 1727204104.44176: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13830 1727204104.44411: variable 'ansible_distribution' from source: facts 13830 1727204104.44421: variable '__network_rh_distros' from source: role '' defaults 13830 1727204104.44434: variable 'ansible_distribution_major_version' from source: facts 13830 1727204104.44516: variable 'network_provider' from source: set_fact 13830 1727204104.44548: variable 'omit' from source: magic vars 13830 1727204104.44591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204104.44625: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204104.44649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204104.44667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204104.44739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204104.44772: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204104.44776: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204104.44778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204104.44890: Set connection var ansible_connection to ssh 13830 1727204104.44902: Set connection var ansible_timeout to 10 13830 1727204104.44907: Set connection var ansible_shell_executable to /bin/sh 13830 1727204104.44912: Set connection var ansible_shell_type to sh 13830 1727204104.44915: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204104.44931: Set connection var ansible_pipelining to False 13830 1727204104.44963: variable 'ansible_shell_executable' from source: unknown 13830 1727204104.44970: variable 'ansible_connection' from source: unknown 13830 1727204104.44972: variable 'ansible_module_compression' from source: unknown 13830 1727204104.44975: variable 'ansible_shell_type' from source: unknown 13830 1727204104.44977: variable 'ansible_shell_executable' from source: unknown 13830 1727204104.44979: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204104.44981: variable 'ansible_pipelining' from source: unknown 13830 1727204104.44985: variable 'ansible_timeout' from source: unknown 13830 1727204104.44990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204104.45430: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204104.45444: variable 'omit' from source: magic vars 13830 1727204104.45450: starting attempt loop 13830 1727204104.45453: running the 
handler 13830 1727204104.46040: variable 'ansible_facts' from source: unknown 13830 1727204104.48018: _low_level_execute_command(): starting 13830 1727204104.48029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204104.48828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204104.48843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204104.48854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.48869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.48917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204104.48924: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204104.48937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.48951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204104.48958: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204104.48967: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204104.48975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204104.48985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.49002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.49012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204104.49019: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204104.49028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.49105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204104.49130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204104.49145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204104.49228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204104.50902: stdout chunk (state=3): >>>/root <<< 13830 1727204104.51091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204104.51095: stdout chunk (state=3): >>><<< 13830 1727204104.51105: stderr chunk (state=3): >>><<< 13830 1727204104.51138: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204104.51150: _low_level_execute_command(): starting 13830 1727204104.51156: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181 `" && echo ansible-tmp-1727204104.511374-16493-27913627961181="` echo /root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181 `" ) && sleep 0' 13830 1727204104.51887: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204104.51904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204104.51915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.51929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.51974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204104.51982: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204104.51993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.52016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204104.52024: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204104.52031: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204104.52043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204104.52052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.52066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.52075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204104.52082: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204104.52092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.52192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204104.52207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204104.52214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204104.52296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204104.54286: stdout chunk (state=3): >>>ansible-tmp-1727204104.511374-16493-27913627961181=/root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181 <<< 13830 1727204104.54496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204104.54500: stdout chunk (state=3): >>><<< 13830 1727204104.54502: stderr chunk (state=3): >>><<< 13830 1727204104.54774: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204104.511374-16493-27913627961181=/root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204104.54778: variable 'ansible_module_compression' from source: unknown 13830 1727204104.54780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13830 1727204104.54782: variable 'ansible_facts' from source: unknown 13830 1727204104.55093: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181/AnsiballZ_systemd.py 13830 1727204104.55901: Sending initial data 13830 1727204104.55905: Sent initial data (154 bytes) 13830 1727204104.58118: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204104.58169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204104.58184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.58202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.58291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204104.58343: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204104.58359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.58382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204104.58392: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204104.58401: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204104.58412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204104.58424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.58442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.58453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204104.58462: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204104.58477: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.58686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204104.58715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204104.58733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204104.58813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204104.60656: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204104.60684: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204104.60728: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpt4ra0w31 /root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181/AnsiballZ_systemd.py <<< 13830 1727204104.60773: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204104.63599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204104.63773: stderr chunk (state=3): >>><<< 13830 1727204104.63777: stdout chunk (state=3): >>><<< 13830 1727204104.63779: done transferring module to remote 13830 1727204104.63781: _low_level_execute_command(): starting 13830 1727204104.63783: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181/ /root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181/AnsiballZ_systemd.py && sleep 0' 13830 1727204104.65483: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204104.65498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204104.65519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.65537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.65581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204104.65626: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204104.65646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.65745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204104.65761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204104.65776: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204104.65787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204104.65799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.65814: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.65824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204104.65839: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204104.65853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.65931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204104.66069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204104.66088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204104.66173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204104.68122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204104.68126: stdout chunk (state=3): >>><<< 13830 1727204104.68128: stderr chunk (state=3): >>><<< 13830 1727204104.68224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204104.68228: _low_level_execute_command(): starting 13830 1727204104.68231: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181/AnsiballZ_systemd.py && sleep 0' 13830 1727204104.71009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.71014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204104.71044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.71048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204104.71050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204104.71052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204104.71352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204104.71425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204104.71513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204104.98031: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 13830 1727204104.98063: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "15921152", "MemoryAvailable": "infinity", "CPUUsageNSec": "754515000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", 
"MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": 
"shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13830 1727204104.99790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204104.99878: stderr chunk (state=3): >>><<< 13830 1727204104.99882: stdout chunk (state=3): >>><<< 13830 1727204105.00074: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15921152", "MemoryAvailable": "infinity", "CPUUsageNSec": "754515000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204105.00183: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204105.00187: _low_level_execute_command(): starting 13830 1727204105.00190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204104.511374-16493-27913627961181/ > /dev/null 2>&1 && sleep 0' 13830 1727204105.02314: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204105.02329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.02344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.02362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.02410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.02421: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204105.02434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.02451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204105.02461: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204105.02475: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204105.02488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.02500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.02514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.02524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.02535: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204105.02550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.02627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204105.02780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204105.02797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204105.02880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204105.04794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204105.04888: stderr chunk (state=3): >>><<< 13830 1727204105.04892: stdout chunk (state=3): >>><<< 13830 1727204105.05171: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204105.05175: handler run complete 13830 1727204105.05177: attempt loop complete, returning result 13830 1727204105.05179: _execute() done 13830 1727204105.05181: dumping result to json 13830 1727204105.05183: done dumping result, returning 13830 1727204105.05185: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1659-6b02-00000000069f] 13830 1727204105.05187: sending task result for task 0affcd87-79f5-1659-6b02-00000000069f ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204105.05413: no more pending results, returning what we have 13830 1727204105.05417: results queue empty 13830 1727204105.05418: checking for any_errors_fatal 13830 1727204105.05425: done checking for any_errors_fatal 13830 1727204105.05426: checking for max_fail_percentage 13830 1727204105.05427: done checking for max_fail_percentage 13830 1727204105.05428: checking to see if all hosts have failed and the running result is not ok 13830 1727204105.05429: done checking to see if all hosts have failed 13830 1727204105.05429: getting the remaining hosts for this loop 13830 1727204105.05433: done getting the remaining hosts for this loop 13830 1727204105.05437: getting the next task for host managed-node3 13830 1727204105.05444: done getting next task for host managed-node3 13830 1727204105.05448: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13830 1727204105.05453: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204105.05465: getting variables 13830 1727204105.05467: in VariableManager get_vars() 13830 1727204105.05498: Calling all_inventory to load vars for managed-node3 13830 1727204105.05501: Calling groups_inventory to load vars for managed-node3 13830 1727204105.05503: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204105.05512: Calling all_plugins_play to load vars for managed-node3 13830 1727204105.05515: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204105.05517: Calling groups_plugins_play to load vars for managed-node3 13830 1727204105.07311: done sending task result for task 0affcd87-79f5-1659-6b02-00000000069f 13830 1727204105.07320: WORKER PROCESS EXITING 13830 1727204105.09473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204105.14189: done with get_vars() 13830 1727204105.14227: done getting variables 13830 1727204105.14415: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.896) 0:00:38.222 ***** 13830 1727204105.14459: entering _queue_task() for managed-node3/service 13830 1727204105.15191: worker is 1 (out of 1 available) 13830 1727204105.15205: exiting _queue_task() for managed-node3/service 13830 1727204105.15219: done queuing things up, now waiting for results queue to drain 13830 1727204105.15221: waiting for pending results... 
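The OpenSSH debug lines that wrap every _low_level_execute_command() call in this log ("auto-mux: Trying existing master", "mux_client_request_session: master session id: 2") show connection multiplexing at work: each remote command and file transfer reuses a persistent master connection to 10.31.15.87 instead of negotiating a fresh SSH session. Ansible's ssh connection plugin enables this behaviour by default; the sketch below only writes equivalent settings out explicitly as an inventory variable, and the option values are illustrative assumptions, not values read from this run's configuration.

# Illustrative group_vars/host_vars entry making the connection plugin's
# default multiplexing behaviour explicit (values are assumptions, not
# taken from this run).
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s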
13830 1727204105.16270: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13830 1727204105.16545: in run() - task 0affcd87-79f5-1659-6b02-0000000006a0 13830 1727204105.16676: variable 'ansible_search_path' from source: unknown 13830 1727204105.16680: variable 'ansible_search_path' from source: unknown 13830 1727204105.16720: calling self._execute() 13830 1727204105.16910: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204105.16914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204105.16927: variable 'omit' from source: magic vars 13830 1727204105.17845: variable 'ansible_distribution_major_version' from source: facts 13830 1727204105.17858: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204105.18192: variable 'network_provider' from source: set_fact 13830 1727204105.18198: Evaluated conditional (network_provider == "nm"): True 13830 1727204105.18404: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204105.18504: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204105.18696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204105.24069: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204105.24259: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204105.24303: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204105.24414: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204105.24500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204105.24731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204105.24813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204105.24845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204105.24898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204105.24912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204105.24963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204105.24998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 13830 1727204105.25023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204105.25068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204105.25086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204105.25136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204105.25161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204105.25190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204105.25241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204105.25255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204105.25432: variable 'network_connections' from source: task vars 13830 1727204105.25495: variable 'port2_profile' from source: play vars 13830 1727204105.25584: variable 'port2_profile' from source: play vars 13830 1727204105.25597: variable 'port1_profile' from source: play vars 13830 1727204105.25687: variable 'port1_profile' from source: play vars 13830 1727204105.25696: variable 'controller_profile' from source: play vars 13830 1727204105.25771: variable 'controller_profile' from source: play vars 13830 1727204105.25848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204105.26045: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204105.26093: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204105.26123: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204105.26154: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204105.26211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204105.26234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204105.26262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204105.26302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204105.26354: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204105.26691: variable 'network_connections' from source: task vars 13830 1727204105.26694: variable 'port2_profile' from source: play vars 13830 1727204105.26828: variable 'port2_profile' from source: play vars 13830 1727204105.26838: variable 'port1_profile' from source: play vars 13830 1727204105.26926: variable 'port1_profile' from source: play vars 13830 1727204105.26985: variable 'controller_profile' from source: play vars 13830 1727204105.27097: variable 'controller_profile' from source: play vars 13830 1727204105.27129: Evaluated conditional (__network_wpa_supplicant_required): False 13830 1727204105.27133: when evaluation is False, skipping this task 13830 1727204105.27137: _execute() done 13830 1727204105.27139: dumping result to json 13830 1727204105.27144: done dumping result, returning 13830 1727204105.27153: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1659-6b02-0000000006a0] 13830 1727204105.27158: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a0 13830 1727204105.27275: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a0 13830 1727204105.27277: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13830 1727204105.27338: no more pending results, returning what we have 13830 1727204105.27342: results queue empty 13830 1727204105.27343: checking for any_errors_fatal 13830 1727204105.27389: done checking for any_errors_fatal 13830 1727204105.27390: checking for max_fail_percentage 13830 1727204105.27393: done checking for max_fail_percentage 13830 1727204105.27394: checking to see if all hosts have failed and the running result is not ok 13830 1727204105.27394: done checking to see if all hosts have failed 13830 1727204105.27395: getting the remaining hosts for this loop 13830 1727204105.27397: done getting the remaining hosts for this loop 13830 1727204105.27402: getting the next task for host managed-node3 13830 1727204105.27410: done getting next task for host managed-node3 13830 1727204105.27415: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13830 1727204105.27420: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204105.27443: getting variables 13830 1727204105.27445: in VariableManager get_vars() 13830 1727204105.27491: Calling all_inventory to load vars for managed-node3 13830 1727204105.27494: Calling groups_inventory to load vars for managed-node3 13830 1727204105.27497: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204105.27509: Calling all_plugins_play to load vars for managed-node3 13830 1727204105.27512: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204105.27515: Calling groups_plugins_play to load vars for managed-node3 13830 1727204105.29343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204105.31625: done with get_vars() 13830 1727204105.31655: done getting variables 13830 1727204105.31779: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.173) 0:00:38.396 ***** 13830 1727204105.31819: entering _queue_task() for managed-node3/service 13830 1727204105.32367: worker is 1 (out of 1 available) 13830 1727204105.32381: exiting _queue_task() for managed-node3/service 13830 1727204105.32396: done queuing things up, now waiting for results queue to drain 13830 1727204105.32397: waiting for pending results... 
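The wpa_supplicant skip above is the role's gate doing its job: network_provider == "nm" evaluated True, but __network_wpa_supplicant_required (derived in the role defaults from whether any wireless or IEEE 802.1X connections are defined) evaluated False, so the service task never runs. A hedged paraphrase of that gating follows; the service name and the exact form of the when clauses are assumptions based on the conditionals logged here, not the role's verbatim source.

# Hedged paraphrase of the gating seen in the log; names are assumptions.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool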
13830 1727204105.32747: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 13830 1727204105.32922: in run() - task 0affcd87-79f5-1659-6b02-0000000006a1 13830 1727204105.32945: variable 'ansible_search_path' from source: unknown 13830 1727204105.32948: variable 'ansible_search_path' from source: unknown 13830 1727204105.32992: calling self._execute() 13830 1727204105.33096: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204105.33100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204105.33110: variable 'omit' from source: magic vars 13830 1727204105.33541: variable 'ansible_distribution_major_version' from source: facts 13830 1727204105.33554: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204105.33690: variable 'network_provider' from source: set_fact 13830 1727204105.33696: Evaluated conditional (network_provider == "initscripts"): False 13830 1727204105.33699: when evaluation is False, skipping this task 13830 1727204105.33701: _execute() done 13830 1727204105.33704: dumping result to json 13830 1727204105.33708: done dumping result, returning 13830 1727204105.33724: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1659-6b02-0000000006a1] 13830 1727204105.33740: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a1 13830 1727204105.33843: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a1 13830 1727204105.33847: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204105.33900: no more pending results, returning what we have 13830 1727204105.33905: results queue empty 13830 1727204105.33906: checking for any_errors_fatal 13830 1727204105.33916: done checking for any_errors_fatal 13830 1727204105.33917: checking for max_fail_percentage 13830 1727204105.33919: done checking for max_fail_percentage 13830 1727204105.33920: checking to see if all hosts have failed and the running result is not ok 13830 1727204105.33921: done checking to see if all hosts have failed 13830 1727204105.33922: getting the remaining hosts for this loop 13830 1727204105.33924: done getting the remaining hosts for this loop 13830 1727204105.33930: getting the next task for host managed-node3 13830 1727204105.33942: done getting next task for host managed-node3 13830 1727204105.33947: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13830 1727204105.33954: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204105.33977: getting variables 13830 1727204105.33979: in VariableManager get_vars() 13830 1727204105.34024: Calling all_inventory to load vars for managed-node3 13830 1727204105.34027: Calling groups_inventory to load vars for managed-node3 13830 1727204105.34030: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204105.34045: Calling all_plugins_play to load vars for managed-node3 13830 1727204105.34048: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204105.34051: Calling groups_plugins_play to load vars for managed-node3 13830 1727204105.35926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204105.38356: done with get_vars() 13830 1727204105.38396: done getting variables 13830 1727204105.38467: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.066) 0:00:38.463 ***** 13830 1727204105.38504: entering _queue_task() for managed-node3/copy 13830 1727204105.38905: worker is 1 (out of 1 available) 13830 1727204105.38918: exiting _queue_task() for managed-node3/copy 13830 1727204105.38930: done queuing things up, now waiting for results queue to drain 13830 1727204105.38934: waiting for pending results... 
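The "Enable network service" task above was skipped for a structural reason, and, as the next entries show, the "Ensure initscripts network file dependency is present" task just queued is skipped on the same grounds: both apply only to the legacy initscripts provider, and this run resolved network_provider to "nm", so Ansible reports false_condition: network_provider == "initscripts" and moves on. A hedged sketch of that pair of gated tasks follows; the service name, destination path, and file content are assumptions for illustration, not the role's verbatim source.

# Hedged sketch of initscripts-only tasks; details are illustrative assumptions.
- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"

- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network   # assumed path
    content: "# Created by the network system role\n"
    mode: "0644"
  when: network_provider == "initscripts"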
13830 1727204105.39295: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13830 1727204105.39471: in run() - task 0affcd87-79f5-1659-6b02-0000000006a2 13830 1727204105.39491: variable 'ansible_search_path' from source: unknown 13830 1727204105.39495: variable 'ansible_search_path' from source: unknown 13830 1727204105.39533: calling self._execute() 13830 1727204105.39647: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204105.39653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204105.39675: variable 'omit' from source: magic vars 13830 1727204105.40139: variable 'ansible_distribution_major_version' from source: facts 13830 1727204105.40161: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204105.40298: variable 'network_provider' from source: set_fact 13830 1727204105.40302: Evaluated conditional (network_provider == "initscripts"): False 13830 1727204105.40305: when evaluation is False, skipping this task 13830 1727204105.40308: _execute() done 13830 1727204105.40310: dumping result to json 13830 1727204105.40314: done dumping result, returning 13830 1727204105.40327: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1659-6b02-0000000006a2] 13830 1727204105.40337: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a2 13830 1727204105.40457: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a2 13830 1727204105.40460: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13830 1727204105.40521: no more pending results, returning what we have 13830 1727204105.40526: results queue empty 13830 1727204105.40527: checking for any_errors_fatal 13830 1727204105.40540: done checking for any_errors_fatal 13830 1727204105.40542: checking for max_fail_percentage 13830 1727204105.40544: done checking for max_fail_percentage 13830 1727204105.40545: checking to see if all hosts have failed and the running result is not ok 13830 1727204105.40546: done checking to see if all hosts have failed 13830 1727204105.40546: getting the remaining hosts for this loop 13830 1727204105.40549: done getting the remaining hosts for this loop 13830 1727204105.40553: getting the next task for host managed-node3 13830 1727204105.40562: done getting next task for host managed-node3 13830 1727204105.40568: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13830 1727204105.40576: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204105.40597: getting variables 13830 1727204105.40599: in VariableManager get_vars() 13830 1727204105.40648: Calling all_inventory to load vars for managed-node3 13830 1727204105.40651: Calling groups_inventory to load vars for managed-node3 13830 1727204105.40653: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204105.40667: Calling all_plugins_play to load vars for managed-node3 13830 1727204105.40671: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204105.40674: Calling groups_plugins_play to load vars for managed-node3 13830 1727204105.44452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204105.47846: done with get_vars() 13830 1727204105.47897: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.094) 0:00:38.558 ***** 13830 1727204105.48000: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 13830 1727204105.48366: worker is 1 (out of 1 available) 13830 1727204105.48381: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 13830 1727204105.48394: done queuing things up, now waiting for results queue to drain 13830 1727204105.48395: waiting for pending results... 
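The next task hands the network_connections variable to the role's network_connections module. The play vars resolved just below (controller_profile, port1_profile, port2_profile) and the later reference to bond0.1 indicate a bond controller with two port profiles. The sketch below shows a network_connections list of that shape for orientation only; the profile names, interface names, and option keys are assumptions, not values read from this run.

# Illustrative shape of the network_connections data this play appears to use;
# names and keys are assumptions for orientation, not from this run.
network_connections:
  - name: "{{ controller_profile }}"
    type: bond
    interface_name: bond0
    state: up
  - name: "{{ port1_profile }}"
    type: ethernet
    controller: "{{ controller_profile }}"
    state: up
  - name: "{{ port2_profile }}"
    type: ethernet
    controller: "{{ controller_profile }}"
    state: up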
13830 1727204105.48712: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13830 1727204105.48876: in run() - task 0affcd87-79f5-1659-6b02-0000000006a3 13830 1727204105.48891: variable 'ansible_search_path' from source: unknown 13830 1727204105.48898: variable 'ansible_search_path' from source: unknown 13830 1727204105.48931: calling self._execute() 13830 1727204105.49048: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204105.49057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204105.49069: variable 'omit' from source: magic vars 13830 1727204105.49423: variable 'ansible_distribution_major_version' from source: facts 13830 1727204105.49437: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204105.49444: variable 'omit' from source: magic vars 13830 1727204105.49519: variable 'omit' from source: magic vars 13830 1727204105.49683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204105.52391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204105.52458: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204105.52509: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204105.52545: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204105.52574: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204105.52662: variable 'network_provider' from source: set_fact 13830 1727204105.52802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204105.52833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204105.52860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204105.52896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204105.52910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204105.52992: variable 'omit' from source: magic vars 13830 1727204105.53105: variable 'omit' from source: magic vars 13830 1727204105.53210: variable 'network_connections' from source: task vars 13830 1727204105.53218: variable 'port2_profile' from source: play vars 13830 1727204105.53361: variable 'port2_profile' from source: play vars 13830 1727204105.53376: variable 'port1_profile' from source: play vars 13830 1727204105.53430: variable 'port1_profile' from source: play vars 13830 1727204105.53440: variable 'controller_profile' from source: 
play vars 13830 1727204105.53501: variable 'controller_profile' from source: play vars 13830 1727204105.53659: variable 'omit' from source: magic vars 13830 1727204105.53668: variable '__lsr_ansible_managed' from source: task vars 13830 1727204105.53737: variable '__lsr_ansible_managed' from source: task vars 13830 1727204105.53987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13830 1727204105.54219: Loaded config def from plugin (lookup/template) 13830 1727204105.54222: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13830 1727204105.54259: File lookup term: get_ansible_managed.j2 13830 1727204105.54262: variable 'ansible_search_path' from source: unknown 13830 1727204105.54267: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13830 1727204105.54282: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13830 1727204105.54313: variable 'ansible_search_path' from source: unknown 13830 1727204105.61089: variable 'ansible_managed' from source: unknown 13830 1727204105.61257: variable 'omit' from source: magic vars 13830 1727204105.61287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204105.61318: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204105.61340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204105.61358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204105.61370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204105.61399: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204105.61403: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204105.61405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204105.61507: Set connection var ansible_connection to ssh 13830 1727204105.61518: Set connection var ansible_timeout to 10 13830 1727204105.61524: Set connection var ansible_shell_executable to /bin/sh 13830 1727204105.61526: Set connection var ansible_shell_type to sh 13830 1727204105.61543: Set connection var ansible_module_compression to 
ZIP_DEFLATED 13830 1727204105.61551: Set connection var ansible_pipelining to False 13830 1727204105.61580: variable 'ansible_shell_executable' from source: unknown 13830 1727204105.61583: variable 'ansible_connection' from source: unknown 13830 1727204105.61586: variable 'ansible_module_compression' from source: unknown 13830 1727204105.61588: variable 'ansible_shell_type' from source: unknown 13830 1727204105.61590: variable 'ansible_shell_executable' from source: unknown 13830 1727204105.61592: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204105.61594: variable 'ansible_pipelining' from source: unknown 13830 1727204105.61598: variable 'ansible_timeout' from source: unknown 13830 1727204105.61612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204105.61745: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204105.61760: variable 'omit' from source: magic vars 13830 1727204105.61770: starting attempt loop 13830 1727204105.61775: running the handler 13830 1727204105.61786: _low_level_execute_command(): starting 13830 1727204105.61793: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204105.62582: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204105.62595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.62606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.62623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.62673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.62680: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204105.62691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.62704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204105.62711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204105.62718: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204105.62726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.62740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.62757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.62766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.62773: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204105.62783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.62860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204105.62880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204105.62885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204105.62979: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13830 1727204105.64656: stdout chunk (state=3): >>>/root <<< 13830 1727204105.64968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204105.64972: stdout chunk (state=3): >>><<< 13830 1727204105.64982: stderr chunk (state=3): >>><<< 13830 1727204105.65004: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204105.65016: _low_level_execute_command(): starting 13830 1727204105.65023: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801 `" && echo ansible-tmp-1727204105.650048-16553-272351565180801="` echo /root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801 `" ) && sleep 0' 13830 1727204105.65694: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204105.65703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.65714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.65729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.65780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.65786: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204105.65797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.65811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204105.65818: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204105.65825: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204105.65833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.65847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.65856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.65872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
<<< 13830 1727204105.65878: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204105.65888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.65977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204105.65981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204105.65983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204105.66065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204105.67943: stdout chunk (state=3): >>>ansible-tmp-1727204105.650048-16553-272351565180801=/root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801 <<< 13830 1727204105.68058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204105.68144: stderr chunk (state=3): >>><<< 13830 1727204105.68147: stdout chunk (state=3): >>><<< 13830 1727204105.68171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204105.650048-16553-272351565180801=/root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204105.68215: variable 'ansible_module_compression' from source: unknown 13830 1727204105.68269: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13830 1727204105.68306: variable 'ansible_facts' from source: unknown 13830 1727204105.68419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801/AnsiballZ_network_connections.py 13830 1727204105.68562: Sending initial data 13830 1727204105.68567: Sent initial data (167 bytes) 13830 1727204105.69578: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204105.69589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.69596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.69612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.69656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 
1727204105.69663: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204105.69678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.69698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204105.69701: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204105.69704: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204105.69710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.69727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.69737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.69745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.69752: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204105.69761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.69832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204105.69849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204105.69859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204105.70236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204105.71735: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204105.71789: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204105.71825: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpjooteqag /root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801/AnsiballZ_network_connections.py <<< 13830 1727204105.71844: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204105.73560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204105.73773: stderr chunk (state=3): >>><<< 13830 1727204105.73776: stdout chunk (state=3): >>><<< 13830 1727204105.73779: done transferring module to remote 13830 1727204105.73781: _low_level_execute_command(): starting 13830 1727204105.73783: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801/ /root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801/AnsiballZ_network_connections.py && sleep 0' 13830 1727204105.74487: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204105.74503: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13830 1727204105.74518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.74555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.74604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.74617: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204105.74639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.74666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204105.74681: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204105.74692: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204105.74705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.74723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.74744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.74767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.74779: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204105.74793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.74883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204105.74908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204105.74925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204105.75018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204105.76885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204105.76940: stderr chunk (state=3): >>><<< 13830 1727204105.76944: stdout chunk (state=3): >>><<< 13830 1727204105.77050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204105.77058: _low_level_execute_command(): starting 13830 1727204105.77061: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801/AnsiballZ_network_connections.py && sleep 0' 13830 1727204105.77735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204105.77749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.77762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.77779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.77828: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.77845: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204105.77857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.77874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204105.77884: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204105.77894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204105.77905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204105.77924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204105.77941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204105.77955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204105.77967: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204105.77979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204105.78071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204105.78093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204105.78107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204105.78191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204106.29788: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 13830 1727204106.29807: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/1316a51f-04e4-4493-b9cd-1041de4c4b19: error=unknown <<< 13830 1727204106.31889: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/332b7a11-84b4-4fa6-9593-05efe3c41549: error=unknown <<< 13830 1727204106.34103: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 13830 1727204106.34107: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a426d8cc-5539-4594-a8cb-0bd7ae20a9f8: error=unknown <<< 13830 1727204106.34312: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13830 1727204106.36286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204106.36346: stderr chunk (state=3): >>><<< 13830 1727204106.36350: stdout chunk (state=3): >>><<< 13830 1727204106.36515: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/1316a51f-04e4-4493-b9cd-1041de4c4b19: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/332b7a11-84b4-4fa6-9593-05efe3c41549: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xw6co8lp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a426d8cc-5539-4594-a8cb-0bd7ae20a9f8: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204106.36525: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204106.36527: _low_level_execute_command(): starting 13830 1727204106.36530: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204105.650048-16553-272351565180801/ > /dev/null 2>&1 && sleep 0' 13830 1727204106.37989: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204106.38095: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204106.38128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 13830 1727204106.38266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204106.38432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204106.40384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204106.40470: stderr chunk (state=3): >>><<< 13830 1727204106.40474: stdout chunk (state=3): >>><<< 13830 1727204106.40676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204106.40680: handler run complete 13830 1727204106.40682: attempt loop complete, returning result 13830 1727204106.40684: _execute() done 13830 1727204106.40686: dumping result to json 13830 1727204106.40688: done dumping result, returning 13830 1727204106.40691: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1659-6b02-0000000006a3] 13830 1727204106.40693: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a3 13830 1727204106.40776: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a3 13830 1727204106.40780: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13830 1727204106.40904: no more pending results, returning what we have 13830 1727204106.40908: results queue empty 13830 1727204106.40909: checking for any_errors_fatal 13830 1727204106.40918: done checking for any_errors_fatal 13830 1727204106.40919: checking for max_fail_percentage 13830 1727204106.40921: done checking for max_fail_percentage 13830 1727204106.40922: checking to see if all hosts have failed and the running result is not ok 13830 1727204106.40923: done checking to see if all hosts have failed 13830 1727204106.40923: getting the remaining hosts for this loop 13830 1727204106.40925: done getting the remaining hosts for this loop 13830 1727204106.40929: getting the next task for host managed-node3 13830 1727204106.40938: 
done getting next task for host managed-node3 13830 1727204106.40942: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13830 1727204106.40947: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204106.40959: getting variables 13830 1727204106.40961: in VariableManager get_vars() 13830 1727204106.41004: Calling all_inventory to load vars for managed-node3 13830 1727204106.41007: Calling groups_inventory to load vars for managed-node3 13830 1727204106.41010: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204106.41020: Calling all_plugins_play to load vars for managed-node3 13830 1727204106.41028: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204106.41031: Calling groups_plugins_play to load vars for managed-node3 13830 1727204106.43115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204106.45220: done with get_vars() 13830 1727204106.45256: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.973) 0:00:39.531 ***** 13830 1727204106.45377: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 13830 1727204106.45743: worker is 1 (out of 1 available) 13830 1727204106.45758: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 13830 1727204106.45778: done queuing things up, now waiting for results queue to drain 13830 1727204106.45780: waiting for pending results... 
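Note: the changed result above records the exact arguments the role handed to fedora.linux_system_roles.network_connections. The playbook and variable files driving this run are not part of the log, so the following is only a minimal sketch of input that would produce the same module call (the host name is taken from the log, the play layout is assumed); the provider and __header arguments are injected by the role itself, and the LsrNetworkNmError "Connection volatilize aborted ... error=unknown" tracebacks printed on stdout appear to come from the profile clean-up step and did not fail the task in this run (rc=0, changed: true).

    # Sketch only: removes the bond0 VLAN profiles and the bond itself.
    - hosts: managed-node3
      vars:
        network_connections:
          - name: bond0.1
            persistent_state: absent
            state: down
          - name: bond0.0
            persistent_state: absent
            state: down
          - name: bond0
            persistent_state: absent
            state: down
      roles:
        - fedora.linux_system_roles.network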
13830 1727204106.46090: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 13830 1727204106.46259: in run() - task 0affcd87-79f5-1659-6b02-0000000006a4 13830 1727204106.46276: variable 'ansible_search_path' from source: unknown 13830 1727204106.46280: variable 'ansible_search_path' from source: unknown 13830 1727204106.46320: calling self._execute() 13830 1727204106.46408: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.46413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.46426: variable 'omit' from source: magic vars 13830 1727204106.46806: variable 'ansible_distribution_major_version' from source: facts 13830 1727204106.46820: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204106.46943: variable 'network_state' from source: role '' defaults 13830 1727204106.46955: Evaluated conditional (network_state != {}): False 13830 1727204106.46959: when evaluation is False, skipping this task 13830 1727204106.46961: _execute() done 13830 1727204106.46965: dumping result to json 13830 1727204106.46968: done dumping result, returning 13830 1727204106.46980: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1659-6b02-0000000006a4] 13830 1727204106.46987: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a4 13830 1727204106.47084: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a4 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204106.47139: no more pending results, returning what we have 13830 1727204106.47143: results queue empty 13830 1727204106.47144: checking for any_errors_fatal 13830 1727204106.47157: done checking for any_errors_fatal 13830 1727204106.47158: checking for max_fail_percentage 13830 1727204106.47160: done checking for max_fail_percentage 13830 1727204106.47161: checking to see if all hosts have failed and the running result is not ok 13830 1727204106.47161: done checking to see if all hosts have failed 13830 1727204106.47162: getting the remaining hosts for this loop 13830 1727204106.47165: done getting the remaining hosts for this loop 13830 1727204106.47169: getting the next task for host managed-node3 13830 1727204106.47177: done getting next task for host managed-node3 13830 1727204106.47181: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13830 1727204106.47188: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204106.47210: WORKER PROCESS EXITING 13830 1727204106.47221: getting variables 13830 1727204106.47223: in VariableManager get_vars() 13830 1727204106.47268: Calling all_inventory to load vars for managed-node3 13830 1727204106.47272: Calling groups_inventory to load vars for managed-node3 13830 1727204106.47274: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204106.47287: Calling all_plugins_play to load vars for managed-node3 13830 1727204106.47289: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204106.47291: Calling groups_plugins_play to load vars for managed-node3 13830 1727204106.49316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204106.52037: done with get_vars() 13830 1727204106.52188: done getting variables 13830 1727204106.52253: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.070) 0:00:39.602 ***** 13830 1727204106.52402: entering _queue_task() for managed-node3/debug 13830 1727204106.53094: worker is 1 (out of 1 available) 13830 1727204106.53106: exiting _queue_task() for managed-node3/debug 13830 1727204106.53118: done queuing things up, now waiting for results queue to drain 13830 1727204106.53119: waiting for pending results... 
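Note: the skipped "Configure networking state" task above is gated on when: network_state != {}; this run only drives network_connections, so network_state keeps its empty role default and the conditional evaluates to False. The two "Show ... messages for the network_connections" tasks queued next are plain debug tasks over the registered module result. Both patterns are sketched below with stand-in tasks (the role's real tasks at tasks/main.yml:171, :177 and :181 are not reproduced in this log, and the referenced variables are assumed to be in scope):

    # Guard pattern behind the skip, shown on a harmless debug task.
    - name: Configure networking state (illustration of the skip condition)
      ansible.builtin.debug:
        msg: network_state is non-empty, the role would apply it here
      when: network_state != {}

    # Reporting tasks that simply print the registered result.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result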
13830 1727204106.54179: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13830 1727204106.54445: in run() - task 0affcd87-79f5-1659-6b02-0000000006a5 13830 1727204106.54581: variable 'ansible_search_path' from source: unknown 13830 1727204106.54586: variable 'ansible_search_path' from source: unknown 13830 1727204106.54622: calling self._execute() 13830 1727204106.54842: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.54847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.54856: variable 'omit' from source: magic vars 13830 1727204106.55720: variable 'ansible_distribution_major_version' from source: facts 13830 1727204106.55736: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204106.55740: variable 'omit' from source: magic vars 13830 1727204106.55938: variable 'omit' from source: magic vars 13830 1727204106.55975: variable 'omit' from source: magic vars 13830 1727204106.56144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204106.56182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204106.56319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204106.56338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204106.56349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204106.56383: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204106.56387: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.56389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.56611: Set connection var ansible_connection to ssh 13830 1727204106.56622: Set connection var ansible_timeout to 10 13830 1727204106.56629: Set connection var ansible_shell_executable to /bin/sh 13830 1727204106.56634: Set connection var ansible_shell_type to sh 13830 1727204106.56637: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204106.56767: Set connection var ansible_pipelining to False 13830 1727204106.56792: variable 'ansible_shell_executable' from source: unknown 13830 1727204106.56795: variable 'ansible_connection' from source: unknown 13830 1727204106.56798: variable 'ansible_module_compression' from source: unknown 13830 1727204106.56801: variable 'ansible_shell_type' from source: unknown 13830 1727204106.56803: variable 'ansible_shell_executable' from source: unknown 13830 1727204106.56805: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.56810: variable 'ansible_pipelining' from source: unknown 13830 1727204106.56812: variable 'ansible_timeout' from source: unknown 13830 1727204106.56816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.57067: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 
1727204106.57078: variable 'omit' from source: magic vars 13830 1727204106.57202: starting attempt loop 13830 1727204106.57205: running the handler 13830 1727204106.57449: variable '__network_connections_result' from source: set_fact 13830 1727204106.57502: handler run complete 13830 1727204106.57636: attempt loop complete, returning result 13830 1727204106.57641: _execute() done 13830 1727204106.57643: dumping result to json 13830 1727204106.57646: done dumping result, returning 13830 1727204106.57651: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1659-6b02-0000000006a5] 13830 1727204106.57657: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a5 13830 1727204106.57767: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a5 13830 1727204106.57771: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 13830 1727204106.57855: no more pending results, returning what we have 13830 1727204106.57859: results queue empty 13830 1727204106.57860: checking for any_errors_fatal 13830 1727204106.57870: done checking for any_errors_fatal 13830 1727204106.57871: checking for max_fail_percentage 13830 1727204106.57872: done checking for max_fail_percentage 13830 1727204106.57873: checking to see if all hosts have failed and the running result is not ok 13830 1727204106.57874: done checking to see if all hosts have failed 13830 1727204106.57875: getting the remaining hosts for this loop 13830 1727204106.57876: done getting the remaining hosts for this loop 13830 1727204106.57880: getting the next task for host managed-node3 13830 1727204106.57889: done getting next task for host managed-node3 13830 1727204106.57893: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13830 1727204106.57898: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204106.57909: getting variables 13830 1727204106.57910: in VariableManager get_vars() 13830 1727204106.57948: Calling all_inventory to load vars for managed-node3 13830 1727204106.57951: Calling groups_inventory to load vars for managed-node3 13830 1727204106.57953: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204106.57965: Calling all_plugins_play to load vars for managed-node3 13830 1727204106.57968: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204106.57971: Calling groups_plugins_play to load vars for managed-node3 13830 1727204106.60733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204106.64359: done with get_vars() 13830 1727204106.64398: done getting variables 13830 1727204106.64578: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.122) 0:00:39.724 ***** 13830 1727204106.64627: entering _queue_task() for managed-node3/debug 13830 1727204106.65318: worker is 1 (out of 1 available) 13830 1727204106.65444: exiting _queue_task() for managed-node3/debug 13830 1727204106.65457: done queuing things up, now waiting for results queue to drain 13830 1727204106.65459: waiting for pending results... 13830 1727204106.67409: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13830 1727204106.67589: in run() - task 0affcd87-79f5-1659-6b02-0000000006a6 13830 1727204106.67618: variable 'ansible_search_path' from source: unknown 13830 1727204106.67626: variable 'ansible_search_path' from source: unknown 13830 1727204106.67672: calling self._execute() 13830 1727204106.67777: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.67788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.67801: variable 'omit' from source: magic vars 13830 1727204106.68462: variable 'ansible_distribution_major_version' from source: facts 13830 1727204106.68608: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204106.68619: variable 'omit' from source: magic vars 13830 1727204106.68700: variable 'omit' from source: magic vars 13830 1727204106.68846: variable 'omit' from source: magic vars 13830 1727204106.68962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204106.69048: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204106.69108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204106.69269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204106.69286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204106.69319: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204106.69327: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.69337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.69447: Set connection var ansible_connection to ssh 13830 1727204106.69588: Set connection var ansible_timeout to 10 13830 1727204106.69599: Set connection var ansible_shell_executable to /bin/sh 13830 1727204106.69605: Set connection var ansible_shell_type to sh 13830 1727204106.69615: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204106.69628: Set connection var ansible_pipelining to False 13830 1727204106.69712: variable 'ansible_shell_executable' from source: unknown 13830 1727204106.69724: variable 'ansible_connection' from source: unknown 13830 1727204106.69734: variable 'ansible_module_compression' from source: unknown 13830 1727204106.69741: variable 'ansible_shell_type' from source: unknown 13830 1727204106.69748: variable 'ansible_shell_executable' from source: unknown 13830 1727204106.69775: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.69785: variable 'ansible_pipelining' from source: unknown 13830 1727204106.69800: variable 'ansible_timeout' from source: unknown 13830 1727204106.69809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.70153: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204106.70174: variable 'omit' from source: magic vars 13830 1727204106.70240: starting attempt loop 13830 1727204106.70248: running the handler 13830 1727204106.70303: variable '__network_connections_result' from source: set_fact 13830 1727204106.70523: variable '__network_connections_result' from source: set_fact 13830 1727204106.70804: handler run complete 13830 1727204106.70906: attempt loop complete, returning result 13830 1727204106.70913: _execute() done 13830 1727204106.70919: dumping result to json 13830 1727204106.70927: done dumping result, returning 13830 1727204106.70944: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1659-6b02-0000000006a6] 13830 1727204106.71003: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a6 ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13830 1727204106.71230: no more pending results, returning what we have 13830 1727204106.71234: results queue empty 13830 1727204106.71235: checking for any_errors_fatal 13830 1727204106.71245: done checking for any_errors_fatal 13830 1727204106.71246: checking for max_fail_percentage 13830 1727204106.71248: done checking for max_fail_percentage 13830 1727204106.71249: 
checking to see if all hosts have failed and the running result is not ok 13830 1727204106.71249: done checking to see if all hosts have failed 13830 1727204106.71250: getting the remaining hosts for this loop 13830 1727204106.71252: done getting the remaining hosts for this loop 13830 1727204106.71256: getting the next task for host managed-node3 13830 1727204106.71267: done getting next task for host managed-node3 13830 1727204106.71271: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13830 1727204106.71276: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204106.71291: getting variables 13830 1727204106.71293: in VariableManager get_vars() 13830 1727204106.71329: Calling all_inventory to load vars for managed-node3 13830 1727204106.71332: Calling groups_inventory to load vars for managed-node3 13830 1727204106.71334: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204106.71345: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a6 13830 1727204106.71359: Calling all_plugins_play to load vars for managed-node3 13830 1727204106.71368: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204106.71372: Calling groups_plugins_play to load vars for managed-node3 13830 1727204106.72156: WORKER PROCESS EXITING 13830 1727204106.74191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204106.77659: done with get_vars() 13830 1727204106.77693: done getting variables 13830 1727204106.77878: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.132) 0:00:39.857 ***** 13830 1727204106.77917: entering _queue_task() for managed-node3/debug 13830 1727204106.78732: worker is 1 (out of 1 available) 13830 1727204106.78747: exiting _queue_task() for managed-node3/debug 13830 1727204106.78760: done queuing things up, now waiting for results queue to drain 13830 
1727204106.78762: waiting for pending results... 13830 1727204106.79535: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13830 1727204106.79924: in run() - task 0affcd87-79f5-1659-6b02-0000000006a7 13830 1727204106.79950: variable 'ansible_search_path' from source: unknown 13830 1727204106.79986: variable 'ansible_search_path' from source: unknown 13830 1727204106.80119: calling self._execute() 13830 1727204106.80248: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.80312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.80334: variable 'omit' from source: magic vars 13830 1727204106.81057: variable 'ansible_distribution_major_version' from source: facts 13830 1727204106.81187: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204106.81514: variable 'network_state' from source: role '' defaults 13830 1727204106.81539: Evaluated conditional (network_state != {}): False 13830 1727204106.81548: when evaluation is False, skipping this task 13830 1727204106.81556: _execute() done 13830 1727204106.81563: dumping result to json 13830 1727204106.81572: done dumping result, returning 13830 1727204106.81582: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1659-6b02-0000000006a7] 13830 1727204106.81593: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a7 skipping: [managed-node3] => { "false_condition": "network_state != {}" } 13830 1727204106.81776: no more pending results, returning what we have 13830 1727204106.81781: results queue empty 13830 1727204106.81782: checking for any_errors_fatal 13830 1727204106.81794: done checking for any_errors_fatal 13830 1727204106.81795: checking for max_fail_percentage 13830 1727204106.81797: done checking for max_fail_percentage 13830 1727204106.81798: checking to see if all hosts have failed and the running result is not ok 13830 1727204106.81798: done checking to see if all hosts have failed 13830 1727204106.81799: getting the remaining hosts for this loop 13830 1727204106.81801: done getting the remaining hosts for this loop 13830 1727204106.81805: getting the next task for host managed-node3 13830 1727204106.81815: done getting next task for host managed-node3 13830 1727204106.81820: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13830 1727204106.81826: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204106.81851: getting variables 13830 1727204106.81853: in VariableManager get_vars() 13830 1727204106.81892: Calling all_inventory to load vars for managed-node3 13830 1727204106.81895: Calling groups_inventory to load vars for managed-node3 13830 1727204106.81897: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204106.81910: Calling all_plugins_play to load vars for managed-node3 13830 1727204106.81913: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204106.81917: Calling groups_plugins_play to load vars for managed-node3 13830 1727204106.82501: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a7 13830 1727204106.82504: WORKER PROCESS EXITING 13830 1727204106.83501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204106.85509: done with get_vars() 13830 1727204106.85541: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.077) 0:00:39.934 ***** 13830 1727204106.85659: entering _queue_task() for managed-node3/ping 13830 1727204106.86018: worker is 1 (out of 1 available) 13830 1727204106.86031: exiting _queue_task() for managed-node3/ping 13830 1727204106.86043: done queuing things up, now waiting for results queue to drain 13830 1727204106.86044: waiting for pending results... 13830 1727204106.86379: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13830 1727204106.86547: in run() - task 0affcd87-79f5-1659-6b02-0000000006a8 13830 1727204106.86561: variable 'ansible_search_path' from source: unknown 13830 1727204106.86566: variable 'ansible_search_path' from source: unknown 13830 1727204106.86604: calling self._execute() 13830 1727204106.86712: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.86716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.86736: variable 'omit' from source: magic vars 13830 1727204106.87135: variable 'ansible_distribution_major_version' from source: facts 13830 1727204106.87145: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204106.87151: variable 'omit' from source: magic vars 13830 1727204106.87237: variable 'omit' from source: magic vars 13830 1727204106.87272: variable 'omit' from source: magic vars 13830 1727204106.87321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204106.87355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204106.87381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204106.87402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204106.87418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204106.87448: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204106.87452: 
variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.87454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.87567: Set connection var ansible_connection to ssh 13830 1727204106.87577: Set connection var ansible_timeout to 10 13830 1727204106.87582: Set connection var ansible_shell_executable to /bin/sh 13830 1727204106.87585: Set connection var ansible_shell_type to sh 13830 1727204106.87597: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204106.87613: Set connection var ansible_pipelining to False 13830 1727204106.87641: variable 'ansible_shell_executable' from source: unknown 13830 1727204106.87644: variable 'ansible_connection' from source: unknown 13830 1727204106.87647: variable 'ansible_module_compression' from source: unknown 13830 1727204106.87649: variable 'ansible_shell_type' from source: unknown 13830 1727204106.87652: variable 'ansible_shell_executable' from source: unknown 13830 1727204106.87654: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204106.87658: variable 'ansible_pipelining' from source: unknown 13830 1727204106.87660: variable 'ansible_timeout' from source: unknown 13830 1727204106.87668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204106.88673: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204106.88678: variable 'omit' from source: magic vars 13830 1727204106.88680: starting attempt loop 13830 1727204106.88683: running the handler 13830 1727204106.88685: _low_level_execute_command(): starting 13830 1727204106.88687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204106.88974: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204106.88982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204106.88985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204106.88988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204106.88990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204106.88996: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204106.89006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204106.89018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204106.89033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204106.89038: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204106.89050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204106.89059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204106.89072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204106.89317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204106.89321: stderr chunk 
(state=3): >>>debug2: match found <<< 13830 1727204106.89323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204106.89325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204106.89327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204106.89328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204106.89345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204106.91033: stdout chunk (state=3): >>>/root <<< 13830 1727204106.91187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204106.91235: stderr chunk (state=3): >>><<< 13830 1727204106.91238: stdout chunk (state=3): >>><<< 13830 1727204106.91356: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204106.91360: _low_level_execute_command(): starting 13830 1727204106.91366: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609 `" && echo ansible-tmp-1727204106.9125814-16606-233346930946609="` echo /root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609 `" ) && sleep 0' 13830 1727204106.91899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204106.91903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204106.91933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204106.91937: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204106.91940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204106.92007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204106.92023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204106.92103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204106.94072: stdout chunk (state=3): >>>ansible-tmp-1727204106.9125814-16606-233346930946609=/root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609 <<< 13830 1727204106.94276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204106.94279: stdout chunk (state=3): >>><<< 13830 1727204106.94282: stderr chunk (state=3): >>><<< 13830 1727204106.94637: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204106.9125814-16606-233346930946609=/root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204106.94642: variable 'ansible_module_compression' from source: unknown 13830 1727204106.94644: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13830 1727204106.94646: variable 'ansible_facts' from source: unknown 13830 1727204106.94648: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609/AnsiballZ_ping.py 13830 1727204106.94710: Sending initial data 13830 1727204106.94717: Sent initial data (153 bytes) 13830 1727204106.95668: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204106.95684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204106.95697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204106.95712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204106.95756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204106.95771: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204106.95787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
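The surrounding entries show the non-pipelined execution path in full: because ansible_pipelining is set to False for this connection, every module run creates a remote temp directory under ~/.ansible/tmp, uploads the AnsiballZ-packaged module over SFTP, marks it executable, runs it with the remote Python, and then removes the directory. A minimal sketch of how that round-trip could be collapsed for hosts like these, assuming a hypothetical group_vars file that is not part of this test run:

    # group_vars/all.yml (hypothetical; not used in this run)
    # With pipelining on, the AnsiballZ payload is fed over the SSH session's
    # stdin instead of the mkdir / sftp put / chmod / rm cycle traced above.
    ansible_pipelining: true

The usual caveat applies: pipelining only works when the become method on the target does not require a tty (the sudoers "requiretty" case).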
13830 1727204106.95803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204106.95813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204106.95822: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204106.95835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204106.95847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204106.95860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204106.95882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204106.95892: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204106.95905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204106.95981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204106.96002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204106.96017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204106.96095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204106.97995: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204106.98015: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204106.98083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpixg5bvwf /root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609/AnsiballZ_ping.py <<< 13830 1727204106.98112: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204106.99294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204106.99475: stderr chunk (state=3): >>><<< 13830 1727204106.99478: stdout chunk (state=3): >>><<< 13830 1727204106.99480: done transferring module to remote 13830 1727204106.99483: _low_level_execute_command(): starting 13830 1727204106.99485: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609/ /root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609/AnsiballZ_ping.py && sleep 0' 13830 1727204107.00762: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.00773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.00784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.00796: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.00843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.00896: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.00904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.00917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.00936: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.00954: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.00962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.00972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.00984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.01011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.01016: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.01037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.01186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.01200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.01216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.01297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.03233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.03238: stdout chunk (state=3): >>><<< 13830 1727204107.03240: stderr chunk (state=3): >>><<< 13830 1727204107.03243: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.03245: _low_level_execute_command(): starting 13830 1727204107.03247: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609/AnsiballZ_ping.py && sleep 0' 13830 1727204107.04723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 13830 1727204107.04728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.04733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.04736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.04774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.04780: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.04794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.04804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.04811: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.04817: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.04825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.04898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.04909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.04918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.04924: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.04936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.05003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.05155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.05158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.05174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.18375: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13830 1727204107.19574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204107.19579: stdout chunk (state=3): >>><<< 13830 1727204107.19585: stderr chunk (state=3): >>><<< 13830 1727204107.19605: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204107.19633: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204107.19639: _low_level_execute_command(): starting 13830 1727204107.19646: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204106.9125814-16606-233346930946609/ > /dev/null 2>&1 && sleep 0' 13830 1727204107.20273: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.20282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.20322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.20325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.20362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.20367: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.20369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.20418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.20422: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.20454: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.20457: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.20459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.20461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.20465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.20468: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.20470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.21116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.21120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.21122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.21124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.23188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.23194: stdout chunk (state=3): >>><<< 13830 1727204107.23196: stderr chunk (state=3): >>><<< 13830 1727204107.23243: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.23247: handler run complete 13830 1727204107.23475: attempt loop complete, returning result 13830 1727204107.23478: _execute() done 13830 1727204107.23481: dumping result to json 13830 1727204107.23483: done dumping result, returning 13830 1727204107.23485: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1659-6b02-0000000006a8] 13830 1727204107.23487: sending task result for task 0affcd87-79f5-1659-6b02-0000000006a8 13830 1727204107.23565: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006a8 13830 1727204107.23569: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 13830 1727204107.23644: no more pending results, returning what we have 13830 1727204107.23648: results queue empty 13830 1727204107.23649: checking for any_errors_fatal 13830 1727204107.23654: done checking for any_errors_fatal 13830 1727204107.23655: checking for max_fail_percentage 13830 1727204107.23657: done checking for max_fail_percentage 13830 
1727204107.23658: checking to see if all hosts have failed and the running result is not ok 13830 1727204107.23659: done checking to see if all hosts have failed 13830 1727204107.23659: getting the remaining hosts for this loop 13830 1727204107.23661: done getting the remaining hosts for this loop 13830 1727204107.23672: getting the next task for host managed-node3 13830 1727204107.23683: done getting next task for host managed-node3 13830 1727204107.23685: ^ task is: TASK: meta (role_complete) 13830 1727204107.23690: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204107.23702: getting variables 13830 1727204107.23704: in VariableManager get_vars() 13830 1727204107.23747: Calling all_inventory to load vars for managed-node3 13830 1727204107.23750: Calling groups_inventory to load vars for managed-node3 13830 1727204107.23752: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204107.23763: Calling all_plugins_play to load vars for managed-node3 13830 1727204107.23767: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204107.23771: Calling groups_plugins_play to load vars for managed-node3 13830 1727204107.25791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204107.27740: done with get_vars() 13830 1727204107.27771: done getting variables 13830 1727204107.27870: done queuing things up, now waiting for results queue to drain 13830 1727204107.27872: results queue empty 13830 1727204107.27873: checking for any_errors_fatal 13830 1727204107.27876: done checking for any_errors_fatal 13830 1727204107.27877: checking for max_fail_percentage 13830 1727204107.27879: done checking for max_fail_percentage 13830 1727204107.27880: checking to see if all hosts have failed and the running result is not ok 13830 1727204107.27880: done checking to see if all hosts have failed 13830 1727204107.27881: getting the remaining hosts for this loop 13830 1727204107.27882: done getting the remaining hosts for this loop 13830 1727204107.27885: getting the next task for host managed-node3 13830 1727204107.27898: done getting next task for host managed-node3 13830 1727204107.27901: ^ task is: TASK: Delete the device '{{ controller_device }}' 13830 1727204107.27903: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204107.27906: getting variables 13830 1727204107.27907: in VariableManager get_vars() 13830 1727204107.27921: Calling all_inventory to load vars for managed-node3 13830 1727204107.27923: Calling groups_inventory to load vars for managed-node3 13830 1727204107.27925: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204107.27930: Calling all_plugins_play to load vars for managed-node3 13830 1727204107.27936: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204107.27939: Calling groups_plugins_play to load vars for managed-node3 13830 1727204107.29406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204107.31161: done with get_vars() 13830 1727204107.31191: done getting variables 13830 1727204107.31254: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204107.31400: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Tuesday 24 September 2024 14:55:07 -0400 (0:00:00.457) 0:00:40.392 ***** 13830 1727204107.31446: entering _queue_task() for managed-node3/command 13830 1727204107.31889: worker is 1 (out of 1 available) 13830 1727204107.31901: exiting _queue_task() for managed-node3/command 13830 1727204107.31913: done queuing things up, now waiting for results queue to drain 13830 1727204107.31915: waiting for pending results... 
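The entries that follow run the cleanup task from cleanup_bond_profile+device.yml. Judging only from what the trace exposes (a command action that templates controller_device, a non-zero rc that is later not treated as a failure, and a final changed: false), the task plausibly has a shape like the sketch below; the register name and the exact conditions are assumptions, not the real file contents:

    - name: Delete the device '{{ controller_device }}'
      command: ip link del {{ controller_device }}
      register: device_delete   # hypothetical register name
      failed_when: false        # rc=1 ("Cannot find device") must not fail cleanup
      changed_when: false       # matches the changed: false reported below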
13830 1727204107.32210: running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' 13830 1727204107.32327: in run() - task 0affcd87-79f5-1659-6b02-0000000006d8 13830 1727204107.32340: variable 'ansible_search_path' from source: unknown 13830 1727204107.32344: variable 'ansible_search_path' from source: unknown 13830 1727204107.32387: calling self._execute() 13830 1727204107.32478: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204107.32482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204107.32492: variable 'omit' from source: magic vars 13830 1727204107.32841: variable 'ansible_distribution_major_version' from source: facts 13830 1727204107.32858: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204107.32867: variable 'omit' from source: magic vars 13830 1727204107.32886: variable 'omit' from source: magic vars 13830 1727204107.32984: variable 'controller_device' from source: play vars 13830 1727204107.33000: variable 'omit' from source: magic vars 13830 1727204107.33046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204107.33081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204107.33101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204107.33123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204107.33136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204107.33162: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204107.33166: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204107.33169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204107.33268: Set connection var ansible_connection to ssh 13830 1727204107.33278: Set connection var ansible_timeout to 10 13830 1727204107.33281: Set connection var ansible_shell_executable to /bin/sh 13830 1727204107.33284: Set connection var ansible_shell_type to sh 13830 1727204107.33291: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204107.33299: Set connection var ansible_pipelining to False 13830 1727204107.33321: variable 'ansible_shell_executable' from source: unknown 13830 1727204107.33324: variable 'ansible_connection' from source: unknown 13830 1727204107.33330: variable 'ansible_module_compression' from source: unknown 13830 1727204107.33335: variable 'ansible_shell_type' from source: unknown 13830 1727204107.33337: variable 'ansible_shell_executable' from source: unknown 13830 1727204107.33340: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204107.33342: variable 'ansible_pipelining' from source: unknown 13830 1727204107.33344: variable 'ansible_timeout' from source: unknown 13830 1727204107.33347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204107.33481: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 13830 1727204107.33493: variable 'omit' from source: magic vars 13830 1727204107.33498: starting attempt loop 13830 1727204107.33503: running the handler 13830 1727204107.33516: _low_level_execute_command(): starting 13830 1727204107.33524: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204107.34282: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.34294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.34306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.34324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.34362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.34371: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.34384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.34395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.34402: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.34410: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.34420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.34435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.34446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.34451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.34458: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.34470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.34546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.34562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.34567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.34654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.36360: stdout chunk (state=3): >>>/root <<< 13830 1727204107.36528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.36561: stderr chunk (state=3): >>><<< 13830 1727204107.36567: stdout chunk (state=3): >>><<< 13830 1727204107.36689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.36692: _low_level_execute_command(): starting 13830 1727204107.36700: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962 `" && echo ansible-tmp-1727204107.3658912-16639-160685370751962="` echo /root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962 `" ) && sleep 0' 13830 1727204107.37297: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.37312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.37329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.37352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.37399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.37413: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.37428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.37450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.37466: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.37480: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.37493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.37507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.37523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.37542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.37554: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.37572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.37650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.37673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.37691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.37767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.39748: stdout chunk (state=3): >>>ansible-tmp-1727204107.3658912-16639-160685370751962=/root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962 <<< 13830 1727204107.39940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.39944: stdout chunk (state=3): >>><<< 13830 1727204107.39951: stderr chunk (state=3): >>><<< 13830 1727204107.39974: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204107.3658912-16639-160685370751962=/root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.40006: variable 'ansible_module_compression' from source: unknown 13830 1727204107.40067: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204107.40105: variable 'ansible_facts' from source: unknown 13830 1727204107.40188: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962/AnsiballZ_command.py 13830 1727204107.40387: Sending initial data 13830 1727204107.40390: Sent initial data (156 bytes) 13830 1727204107.41446: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.41455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.41468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.41485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.41525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.41538: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.41548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.41562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.41572: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.41579: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.41587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.41597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.41608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.41616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.41623: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.41631: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.41727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.41736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.41739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.41816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.43574: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204107.43610: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204107.43647: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp162twj6x /root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962/AnsiballZ_command.py <<< 13830 1727204107.43685: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204107.44790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.44957: stderr chunk (state=3): >>><<< 13830 1727204107.44961: stdout chunk (state=3): >>><<< 13830 1727204107.44965: done transferring module to remote 13830 1727204107.44971: _low_level_execute_command(): starting 13830 1727204107.44973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962/ /root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962/AnsiballZ_command.py && sleep 0' 13830 1727204107.45611: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.45630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.45650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.45673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.45716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.45729: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.45751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.45772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.45785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.45797: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.45810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.45823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.45844: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.45859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.45875: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.45890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.45969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.45987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.46002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.46084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.47800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.47895: stderr chunk (state=3): >>><<< 13830 1727204107.47900: stdout chunk (state=3): >>><<< 13830 1727204107.47927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.47930: _low_level_execute_command(): starting 13830 1727204107.47936: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962/AnsiballZ_command.py && sleep 0' 13830 1727204107.48678: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.48682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.48684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.48686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.48720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.49039: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.49044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.49057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.49071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 
1727204107.49083: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.49096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.49110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.49125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.49140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.49150: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.49163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.49244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.49261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.49277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.49361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.63281: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:07.624220", "end": "2024-09-24 14:55:07.631679", "delta": "0:00:00.007459", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204107.64469: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204107.64557: stderr chunk (state=3): >>><<< 13830 1727204107.64561: stdout chunk (state=3): >>><<< 13830 1727204107.64671: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:07.624220", "end": "2024-09-24 14:55:07.631679", "delta": "0:00:00.007459", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. 
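The raw module result above reports rc=1, failed: true and changed: true, yet the task outcome in the next entries is "ok" with changed: false and failed_when_result: false: the failed_when/changed_when overrides are evaluated on the controller after the module returns, which is what the two "Evaluated conditional (False): False" lines record. A tighter variant of the same pattern (a sketch, not the task used here) would tolerate only the specific "Cannot find device" error instead of ignoring every non-zero rc:

    - name: Delete the device 'nm-bond'
      command: ip link del nm-bond
      register: nm_bond_delete
      failed_when:                  # fail only if the rc is non-zero AND the
        - nm_bond_delete.rc != 0    # error is something other than a missing device
        - "'Cannot find device' not in nm_bond_delete.stderr"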
13830 1727204107.64680: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204107.64683: _low_level_execute_command(): starting 13830 1727204107.64685: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204107.3658912-16639-160685370751962/ > /dev/null 2>&1 && sleep 0' 13830 1727204107.66147: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.66786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.66804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.66825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.66878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.66892: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.66907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.66926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.66943: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.66957: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.66978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.66993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.67010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.67023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.67039: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.67055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.67134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.67152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.67169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.67251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.69240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.69245: stdout chunk (state=3): >>><<< 13830 1727204107.69248: stderr chunk (state=3): >>><<< 13830 1727204107.69703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.69707: handler run complete 13830 1727204107.69710: Evaluated conditional (False): False 13830 1727204107.69715: Evaluated conditional (False): False 13830 1727204107.69718: attempt loop complete, returning result 13830 1727204107.69720: _execute() done 13830 1727204107.69722: dumping result to json 13830 1727204107.69724: done dumping result, returning 13830 1727204107.69726: done running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' [0affcd87-79f5-1659-6b02-0000000006d8] 13830 1727204107.69728: sending task result for task 0affcd87-79f5-1659-6b02-0000000006d8 13830 1727204107.69808: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006d8 13830 1727204107.69812: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007459", "end": "2024-09-24 14:55:07.631679", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:55:07.624220" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 13830 1727204107.69879: no more pending results, returning what we have 13830 1727204107.69882: results queue empty 13830 1727204107.69883: checking for any_errors_fatal 13830 1727204107.69885: done checking for any_errors_fatal 13830 1727204107.69886: checking for max_fail_percentage 13830 1727204107.69887: done checking for max_fail_percentage 13830 1727204107.69888: checking to see if all hosts have failed and the running result is not ok 13830 1727204107.69889: done checking to see if all hosts have failed 13830 1727204107.69889: getting the remaining hosts for this loop 13830 1727204107.69891: done getting the remaining hosts for this loop 13830 1727204107.69894: getting the next task for host managed-node3 13830 1727204107.69903: done getting next task for host managed-node3 13830 1727204107.69906: ^ task is: TASK: Remove test interfaces 13830 1727204107.69910: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204107.69914: getting variables 13830 1727204107.69916: in VariableManager get_vars() 13830 1727204107.69951: Calling all_inventory to load vars for managed-node3 13830 1727204107.69954: Calling groups_inventory to load vars for managed-node3 13830 1727204107.69957: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204107.69968: Calling all_plugins_play to load vars for managed-node3 13830 1727204107.69970: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204107.69973: Calling groups_plugins_play to load vars for managed-node3 13830 1727204107.72429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204107.74292: done with get_vars() 13830 1727204107.74325: done getting variables 13830 1727204107.74396: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:55:07 -0400 (0:00:00.429) 0:00:40.822 ***** 13830 1727204107.74434: entering _queue_task() for managed-node3/shell 13830 1727204107.75014: worker is 1 (out of 1 available) 13830 1727204107.75028: exiting _queue_task() for managed-node3/shell 13830 1727204107.75044: done queuing things up, now waiting for results queue to drain 13830 1727204107.75046: waiting for pending results... 
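A note on the "Delete the device 'nm-bond'" result above: the command exited with rc=1 (Cannot find device "nm-bond") yet the task is reported ok, because its failure and change conditions both evaluated to False (the two "Evaluated conditional (False): False" entries, and failed_when_result: false / "changed": false in the result). A cleanup task with that behaviour could be written roughly like the sketch below; it is reconstructed only from the cmd, rc and failed_when_result fields shown in the result, not from the actual test playbook, and the register name is made up:

    - name: Delete the device 'nm-bond'
      command: ip link del nm-bond
      register: remove_bond       # hypothetical variable name
      failed_when: false          # tolerate rc=1 when the device is already gone
      changed_when: false         # matches the "changed": false in the reported result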
13830 1727204107.75444: running TaskExecutor() for managed-node3/TASK: Remove test interfaces 13830 1727204107.75570: in run() - task 0affcd87-79f5-1659-6b02-0000000006de 13830 1727204107.75589: variable 'ansible_search_path' from source: unknown 13830 1727204107.75600: variable 'ansible_search_path' from source: unknown 13830 1727204107.75644: calling self._execute() 13830 1727204107.75757: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204107.75771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204107.75786: variable 'omit' from source: magic vars 13830 1727204107.76155: variable 'ansible_distribution_major_version' from source: facts 13830 1727204107.76174: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204107.76183: variable 'omit' from source: magic vars 13830 1727204107.76233: variable 'omit' from source: magic vars 13830 1727204107.76395: variable 'dhcp_interface1' from source: play vars 13830 1727204107.76406: variable 'dhcp_interface2' from source: play vars 13830 1727204107.76433: variable 'omit' from source: magic vars 13830 1727204107.76487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204107.76521: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204107.76550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204107.76574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204107.76588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204107.76620: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204107.76629: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204107.76641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204107.76756: Set connection var ansible_connection to ssh 13830 1727204107.76774: Set connection var ansible_timeout to 10 13830 1727204107.76785: Set connection var ansible_shell_executable to /bin/sh 13830 1727204107.76795: Set connection var ansible_shell_type to sh 13830 1727204107.76805: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204107.76819: Set connection var ansible_pipelining to False 13830 1727204107.76849: variable 'ansible_shell_executable' from source: unknown 13830 1727204107.76857: variable 'ansible_connection' from source: unknown 13830 1727204107.76867: variable 'ansible_module_compression' from source: unknown 13830 1727204107.76874: variable 'ansible_shell_type' from source: unknown 13830 1727204107.76881: variable 'ansible_shell_executable' from source: unknown 13830 1727204107.76887: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204107.76895: variable 'ansible_pipelining' from source: unknown 13830 1727204107.76904: variable 'ansible_timeout' from source: unknown 13830 1727204107.76912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204107.77092: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204107.77111: variable 'omit' from source: magic vars 13830 1727204107.77126: starting attempt loop 13830 1727204107.77136: running the handler 13830 1727204107.77152: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204107.77180: _low_level_execute_command(): starting 13830 1727204107.77192: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204107.78143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.78159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.78183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.78203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.78253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.78269: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.78285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.78303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.78317: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.78330: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.78346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.78361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.78383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.78396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.78408: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.78423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.78506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.78553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.78571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.78691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.80375: stdout chunk (state=3): >>>/root <<< 13830 1727204107.80561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.80566: stdout chunk (state=3): >>><<< 13830 1727204107.80569: stderr chunk (state=3): >>><<< 13830 1727204107.80686: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.80698: _low_level_execute_command(): starting 13830 1727204107.80702: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631 `" && echo ansible-tmp-1727204107.8059065-16663-275344794296631="` echo /root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631 `" ) && sleep 0' 13830 1727204107.81785: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.81797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.81843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.81847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.81849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.82271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.82294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.82313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.82394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.84413: stdout chunk (state=3): >>>ansible-tmp-1727204107.8059065-16663-275344794296631=/root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631 <<< 13830 1727204107.84621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.84625: stdout chunk (state=3): >>><<< 13830 1727204107.84628: stderr chunk (state=3): >>><<< 13830 1727204107.84775: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204107.8059065-16663-275344794296631=/root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.84779: variable 'ansible_module_compression' from source: unknown 13830 1727204107.84782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204107.84883: variable 'ansible_facts' from source: unknown 13830 1727204107.84906: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631/AnsiballZ_command.py 13830 1727204107.85190: Sending initial data 13830 1727204107.85193: Sent initial data (156 bytes) 13830 1727204107.86260: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.86276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.86289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.86304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.86355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.86371: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.86415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.86439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.86450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.86459: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.86471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.86482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.86495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.86504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.86512: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.86523: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.86604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.86624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.86641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.86776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.88621: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204107.88655: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204107.88699: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpb7_wvo2_ /root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631/AnsiballZ_command.py <<< 13830 1727204107.88737: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204107.90029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.90193: stderr chunk (state=3): >>><<< 13830 1727204107.90197: stdout chunk (state=3): >>><<< 13830 1727204107.90199: done transferring module to remote 13830 1727204107.90202: _low_level_execute_command(): starting 13830 1727204107.90204: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631/ /root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631/AnsiballZ_command.py && sleep 0' 13830 1727204107.91080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.91098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.91116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.91139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.91184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.91197: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.91213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.91238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.91251: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204107.91263: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.91279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.91292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.91307: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.91319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.91338: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.91354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.91428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.91669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.92119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.92193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204107.94111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204107.94114: stdout chunk (state=3): >>><<< 13830 1727204107.94117: stderr chunk (state=3): >>><<< 13830 1727204107.94216: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204107.94220: _low_level_execute_command(): starting 13830 1727204107.94222: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631/AnsiballZ_command.py && sleep 0' 13830 1727204107.94836: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204107.94853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.94871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.94894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.94942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.94955: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204107.94972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.94995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204107.95008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 
1727204107.95019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204107.95035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204107.95050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204107.95069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204107.95082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204107.95093: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204107.95111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204107.95189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204107.95216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204107.95238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204107.95323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204108.13529: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:08.086564", "end": "2024-09-24 14:55:08.133901", "delta": "0:00:00.047337", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204108.15003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204108.15204: stderr chunk (state=3): >>><<< 13830 1727204108.15208: stdout chunk (state=3): >>><<< 13830 1727204108.15355: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:08.086564", "end": "2024-09-24 14:55:08.133901", "delta": "0:00:00.047337", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
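The module result just above carries, in its cmd field, the exact shell script the "Remove test interfaces" action ran to tear down test1, test2 and testbr. Unescaped and wrapped in a task, it would look roughly like the sketch below; only the script body is taken from the log, the shell:/changed_when: wrapper is an assumption, and the real remove_test_interfaces_with_dhcp.yml presumably templates the interface names from dhcp_interface1/dhcp_interface2 rather than hard-coding them:

    - name: Remove test interfaces
      shell: |
        set -euxo pipefail
        exec 1>&2
        rc=0
        ip link delete test1 || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link test1 - error "$rc"
        fi
        ip link delete test2 || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link test2 - error "$rc"
        fi
        ip link delete testbr || rc="$?"
        if [ "$rc" != 0 ]; then
          echo ERROR - could not delete link testbr - error "$rc"
        fi
      changed_when: false   # assumption: the task result further below is reported with "changed": false

The || rc="$?" pattern keeps set -e from aborting the script when a link is already absent, while still recording which deletion failed; here all three deletions succeeded, so the traced rc stays 0.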
13830 1727204108.15368: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204108.15371: _low_level_execute_command(): starting 13830 1727204108.15374: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204107.8059065-16663-275344794296631/ > /dev/null 2>&1 && sleep 0' 13830 1727204108.15977: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204108.15991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204108.16005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.16025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.16073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204108.16086: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204108.16101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.16118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204108.16130: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204108.16145: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204108.16158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204108.16174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.16190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.16202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204108.16213: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204108.16227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.16306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204108.16323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204108.16340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204108.16580: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204108.18880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204108.18885: stdout chunk (state=3): >>><<< 13830 1727204108.18887: stderr chunk (state=3): >>><<< 13830 1727204108.18890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204108.18892: handler run complete 13830 1727204108.18894: Evaluated conditional (False): False 13830 1727204108.18896: attempt loop complete, returning result 13830 1727204108.18897: _execute() done 13830 1727204108.18899: dumping result to json 13830 1727204108.18901: done dumping result, returning 13830 1727204108.18908: done running TaskExecutor() for managed-node3/TASK: Remove test interfaces [0affcd87-79f5-1659-6b02-0000000006de] 13830 1727204108.18910: sending task result for task 0affcd87-79f5-1659-6b02-0000000006de 13830 1727204108.18992: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006de 13830 1727204108.18996: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.047337", "end": "2024-09-24 14:55:08.133901", "rc": 0, "start": "2024-09-24 14:55:08.086564" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 13830 1727204108.19076: no more pending results, returning what we have 13830 1727204108.19080: results queue empty 13830 1727204108.19081: checking for any_errors_fatal 13830 1727204108.19088: done checking for any_errors_fatal 13830 1727204108.19089: checking for max_fail_percentage 13830 1727204108.19091: done checking for max_fail_percentage 13830 1727204108.19091: checking to see if all hosts have failed and the running result is not ok 13830 1727204108.19092: done checking to see if all hosts have failed 13830 1727204108.19093: getting the remaining hosts for this loop 13830 1727204108.19095: done getting the remaining 
hosts for this loop 13830 1727204108.19098: getting the next task for host managed-node3 13830 1727204108.19104: done getting next task for host managed-node3 13830 1727204108.19107: ^ task is: TASK: Stop dnsmasq/radvd services 13830 1727204108.19111: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204108.19123: getting variables 13830 1727204108.19124: in VariableManager get_vars() 13830 1727204108.19159: Calling all_inventory to load vars for managed-node3 13830 1727204108.19162: Calling groups_inventory to load vars for managed-node3 13830 1727204108.19169: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204108.19179: Calling all_plugins_play to load vars for managed-node3 13830 1727204108.19182: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204108.19186: Calling groups_plugins_play to load vars for managed-node3 13830 1727204108.22959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204108.25409: done with get_vars() 13830 1727204108.25442: done getting variables 13830 1727204108.25514: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.511) 0:00:41.333 ***** 13830 1727204108.25548: entering _queue_task() for managed-node3/shell 13830 1727204108.25946: worker is 1 (out of 1 available) 13830 1727204108.25959: exiting _queue_task() for managed-node3/shell 13830 1727204108.25972: done queuing things up, now waiting for results queue to drain 13830 1727204108.25974: waiting for pending results... 
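The "Stop dnsmasq/radvd services" task queued here runs another inline shell script; the full script is visible verbatim in the module result further below in this log. Reassembled from that result, the task would look roughly like the following sketch (only the script body comes from the log; the shell:/changed_when: wrapper is an assumption):

    - name: Stop dnsmasq/radvd services
      shell: |
        set -uxo pipefail
        exec 1>&2
        pkill -F /run/dhcp_testbr.pid
        rm -rf /run/dhcp_testbr.pid
        rm -rf /run/dhcp_testbr.lease
        if grep 'release 6' /etc/redhat-release; then
          # Stop radvd server
          service radvd stop
          iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
        fi
        if systemctl is-active firewalld; then
          for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
              firewall-cmd --remove-service "$service"
            fi
          done
        fi
      changed_when: false   # assumption, mirroring the other cleanup tasks in this run

On this host the grep for 'release 6' finds nothing and systemctl reports firewalld as inactive, so only the pkill and rm steps take effect, which is consistent with the "+ ..." trace in the stderr of the result below.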
13830 1727204108.26352: running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services 13830 1727204108.26473: in run() - task 0affcd87-79f5-1659-6b02-0000000006df 13830 1727204108.26554: variable 'ansible_search_path' from source: unknown 13830 1727204108.26558: variable 'ansible_search_path' from source: unknown 13830 1727204108.26597: calling self._execute() 13830 1727204108.26690: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.26699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.26708: variable 'omit' from source: magic vars 13830 1727204108.27406: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.27419: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.27424: variable 'omit' from source: magic vars 13830 1727204108.27480: variable 'omit' from source: magic vars 13830 1727204108.27515: variable 'omit' from source: magic vars 13830 1727204108.27560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204108.27602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.27622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204108.27639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.27651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.27686: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.27690: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.27692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.27800: Set connection var ansible_connection to ssh 13830 1727204108.27815: Set connection var ansible_timeout to 10 13830 1727204108.27821: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.27823: Set connection var ansible_shell_type to sh 13830 1727204108.27829: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.27839: Set connection var ansible_pipelining to False 13830 1727204108.27861: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.27866: variable 'ansible_connection' from source: unknown 13830 1727204108.27869: variable 'ansible_module_compression' from source: unknown 13830 1727204108.27871: variable 'ansible_shell_type' from source: unknown 13830 1727204108.27874: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.27876: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.27886: variable 'ansible_pipelining' from source: unknown 13830 1727204108.27888: variable 'ansible_timeout' from source: unknown 13830 1727204108.27892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.28042: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.28053: variable 'omit' from source: magic vars 13830 
1727204108.28062: starting attempt loop 13830 1727204108.28069: running the handler 13830 1727204108.28072: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.28092: _low_level_execute_command(): starting 13830 1727204108.28104: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204108.29717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.29722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.29937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204108.29941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204108.30086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.30252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204108.30267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204108.30277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204108.30356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204108.32036: stdout chunk (state=3): >>>/root <<< 13830 1727204108.32176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204108.32374: stderr chunk (state=3): >>><<< 13830 1727204108.32378: stdout chunk (state=3): >>><<< 13830 1727204108.32383: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204108.32386: _low_level_execute_command(): starting 13830 1727204108.32390: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900 `" && echo ansible-tmp-1727204108.3227038-16780-262310546186900="` echo /root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900 `" ) && sleep 0' 13830 1727204108.32989: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204108.32993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.33039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204108.33042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.33044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.33047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.33109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204108.33147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204108.33198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204108.35168: stdout chunk (state=3): >>>ansible-tmp-1727204108.3227038-16780-262310546186900=/root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900 <<< 13830 1727204108.35278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204108.35343: stderr chunk (state=3): >>><<< 13830 1727204108.35346: stdout chunk (state=3): >>><<< 13830 1727204108.35370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204108.3227038-16780-262310546186900=/root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204108.35391: variable 'ansible_module_compression' from source: unknown 13830 1727204108.35433: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204108.35466: variable 'ansible_facts' from source: unknown 13830 1727204108.35529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900/AnsiballZ_command.py 13830 1727204108.35640: Sending initial data 13830 1727204108.35643: Sent initial data (156 bytes) 13830 1727204108.36446: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204108.36454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204108.36466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.36478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.36546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204108.36549: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204108.36551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.36554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204108.36556: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204108.36558: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204108.36567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204108.36581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.36594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.36597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204108.36605: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204108.36655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.36782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204108.36785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204108.36787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204108.37114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204108.38881: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204108.38917: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204108.38957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpk7d5pqo1 /root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900/AnsiballZ_command.py <<< 13830 1727204108.38993: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204108.39853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204108.40085: stderr chunk (state=3): >>><<< 13830 1727204108.40088: stdout chunk (state=3): >>><<< 13830 1727204108.40090: done transferring module to remote 13830 1727204108.40092: _low_level_execute_command(): starting 13830 1727204108.40094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900/ /root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900/AnsiballZ_command.py && sleep 0' 13830 1727204108.40727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204108.40744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204108.40767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.40792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.40838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204108.40850: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204108.40876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.40897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204108.40912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204108.40923: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204108.40937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204108.40950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.40971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.40983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204108.40994: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204108.41013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.41097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204108.41105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204108.41112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204108.41186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204108.42885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204108.42942: stderr chunk (state=3): 
>>><<< 13830 1727204108.42945: stdout chunk (state=3): >>><<< 13830 1727204108.42954: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204108.42957: _low_level_execute_command(): starting 13830 1727204108.42962: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900/AnsiballZ_command.py && sleep 0' 13830 1727204108.43400: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204108.43404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.43434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204108.43445: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204108.43450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.43459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204108.43468: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204108.43481: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.43487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204108.43492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.43545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204108.43558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204108.43569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204108.43639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204108.58700: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf 
/run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:08.566456", "end": "2024-09-24 14:55:08.585823", "delta": "0:00:00.019367", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204108.59950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204108.60007: stderr chunk (state=3): >>><<< 13830 1727204108.60012: stdout chunk (state=3): >>><<< 13830 1727204108.60030: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:08.566456", "end": "2024-09-24 14:55:08.585823", "delta": "0:00:00.019367", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204108.60067: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204108.60075: _low_level_execute_command(): starting 13830 1727204108.60079: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204108.3227038-16780-262310546186900/ > /dev/null 2>&1 && sleep 0' 13830 1727204108.60548: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.60554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204108.60605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.60608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204108.60610: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204108.60655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204108.60678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204108.60721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204108.62515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204108.62568: stderr chunk (state=3): >>><<< 13830 1727204108.62572: stdout chunk (state=3): >>><<< 13830 1727204108.62586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204108.62593: handler run complete 13830 1727204108.62613: Evaluated conditional (False): False 13830 1727204108.62620: attempt loop complete, returning result 13830 1727204108.62623: _execute() done 13830 1727204108.62625: dumping result to json 13830 1727204108.62630: done dumping result, returning 13830 1727204108.62640: done running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services [0affcd87-79f5-1659-6b02-0000000006df] 13830 1727204108.62645: sending task result for task 0affcd87-79f5-1659-6b02-0000000006df 13830 1727204108.62741: done sending task result for task 0affcd87-79f5-1659-6b02-0000000006df 13830 1727204108.62744: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.019367", "end": "2024-09-24 14:55:08.585823", "rc": 0, "start": "2024-09-24 14:55:08.566456" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 13830 
1727204108.62806: no more pending results, returning what we have 13830 1727204108.62811: results queue empty 13830 1727204108.62811: checking for any_errors_fatal 13830 1727204108.62820: done checking for any_errors_fatal 13830 1727204108.62821: checking for max_fail_percentage 13830 1727204108.62822: done checking for max_fail_percentage 13830 1727204108.62823: checking to see if all hosts have failed and the running result is not ok 13830 1727204108.62824: done checking to see if all hosts have failed 13830 1727204108.62824: getting the remaining hosts for this loop 13830 1727204108.62826: done getting the remaining hosts for this loop 13830 1727204108.62830: getting the next task for host managed-node3 13830 1727204108.62840: done getting next task for host managed-node3 13830 1727204108.62842: ^ task is: TASK: Reset bond options to assert 13830 1727204108.62845: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204108.62848: getting variables 13830 1727204108.62850: in VariableManager get_vars() 13830 1727204108.62891: Calling all_inventory to load vars for managed-node3 13830 1727204108.62894: Calling groups_inventory to load vars for managed-node3 13830 1727204108.62897: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204108.62906: Calling all_plugins_play to load vars for managed-node3 13830 1727204108.62908: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204108.62911: Calling groups_plugins_play to load vars for managed-node3 13830 1727204108.63719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204108.64725: done with get_vars() 13830 1727204108.64744: done getting variables 13830 1727204108.64805: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reset bond options to assert] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:59 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.392) 0:00:41.726 ***** 13830 1727204108.64829: entering _queue_task() for managed-node3/set_fact 13830 1727204108.65054: worker is 1 (out of 1 available) 13830 1727204108.65070: exiting _queue_task() for managed-node3/set_fact 13830 1727204108.65083: done queuing things up, now waiting for results queue to drain 13830 1727204108.65085: waiting for pending results... 
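The "Stop dnsmasq/radvd services" result above was produced by the ansible.legacy.command module with _uses_shell=true, i.e. the shell module. The following is only a sketch of how that task might be written: the task name and the script body are copied from the logged module_args, while the choice of ansible.builtin.shell and the changed_when: false are assumptions (the latter inferred from the module returning changed=true while the displayed result reports changed=false).

    # Sketch only: task name and script body taken from the log above;
    # ansible.builtin.shell and changed_when: false are assumptions.
    - name: Stop dnsmasq/radvd services
      ansible.builtin.shell: |
        set -uxo pipefail
        exec 1>&2
        pkill -F /run/dhcp_testbr.pid
        rm -rf /run/dhcp_testbr.pid
        rm -rf /run/dhcp_testbr.lease
        if grep 'release 6' /etc/redhat-release; then
          # Stop radvd server
          service radvd stop
          iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
        fi
        if systemctl is-active firewalld; then
          for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
              firewall-cmd --remove-service "$service"
            fi
          done
        fi
      changed_when: false

Note that the script sets -uxo pipefail but not -e, so a pkill that matches no process does not abort the cleanup; the xtrace output is what appears in the STDERR block of the result above.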
13830 1727204108.65271: running TaskExecutor() for managed-node3/TASK: Reset bond options to assert 13830 1727204108.65339: in run() - task 0affcd87-79f5-1659-6b02-00000000000f 13830 1727204108.65353: variable 'ansible_search_path' from source: unknown 13830 1727204108.65381: calling self._execute() 13830 1727204108.65453: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.65458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.65469: variable 'omit' from source: magic vars 13830 1727204108.65742: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.65755: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.65760: variable 'omit' from source: magic vars 13830 1727204108.65786: variable 'omit' from source: magic vars 13830 1727204108.65812: variable 'dhcp_interface1' from source: play vars 13830 1727204108.65863: variable 'dhcp_interface1' from source: play vars 13830 1727204108.65878: variable 'omit' from source: magic vars 13830 1727204108.65913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204108.65941: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.65957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204108.65975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.65982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.66006: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.66010: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.66012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.66082: Set connection var ansible_connection to ssh 13830 1727204108.66091: Set connection var ansible_timeout to 10 13830 1727204108.66096: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.66098: Set connection var ansible_shell_type to sh 13830 1727204108.66103: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.66111: Set connection var ansible_pipelining to False 13830 1727204108.66129: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.66132: variable 'ansible_connection' from source: unknown 13830 1727204108.66136: variable 'ansible_module_compression' from source: unknown 13830 1727204108.66138: variable 'ansible_shell_type' from source: unknown 13830 1727204108.66141: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.66144: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.66148: variable 'ansible_pipelining' from source: unknown 13830 1727204108.66151: variable 'ansible_timeout' from source: unknown 13830 1727204108.66155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.66258: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 13830 1727204108.66270: variable 'omit' from source: magic vars 13830 1727204108.66276: starting attempt loop 13830 1727204108.66279: running the handler 13830 1727204108.66290: handler run complete 13830 1727204108.66298: attempt loop complete, returning result 13830 1727204108.66301: _execute() done 13830 1727204108.66303: dumping result to json 13830 1727204108.66305: done dumping result, returning 13830 1727204108.66312: done running TaskExecutor() for managed-node3/TASK: Reset bond options to assert [0affcd87-79f5-1659-6b02-00000000000f] 13830 1727204108.66317: sending task result for task 0affcd87-79f5-1659-6b02-00000000000f 13830 1727204108.66411: done sending task result for task 0affcd87-79f5-1659-6b02-00000000000f 13830 1727204108.66414: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "bond_options_to_assert": [ { "key": "mode", "value": "active-backup" }, { "key": "arp_interval", "value": "60" }, { "key": "arp_ip_target", "value": "192.0.2.128" }, { "key": "arp_validate", "value": "none" }, { "key": "primary", "value": "test1" } ] }, "changed": false } 13830 1727204108.66488: no more pending results, returning what we have 13830 1727204108.66491: results queue empty 13830 1727204108.66492: checking for any_errors_fatal 13830 1727204108.66501: done checking for any_errors_fatal 13830 1727204108.66502: checking for max_fail_percentage 13830 1727204108.66504: done checking for max_fail_percentage 13830 1727204108.66505: checking to see if all hosts have failed and the running result is not ok 13830 1727204108.66505: done checking to see if all hosts have failed 13830 1727204108.66506: getting the remaining hosts for this loop 13830 1727204108.66508: done getting the remaining hosts for this loop 13830 1727204108.66512: getting the next task for host managed-node3 13830 1727204108.66519: done getting next task for host managed-node3 13830 1727204108.66522: ^ task is: TASK: Include the task 'run_test.yml' 13830 1727204108.66524: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204108.66529: getting variables 13830 1727204108.66531: in VariableManager get_vars() 13830 1727204108.66562: Calling all_inventory to load vars for managed-node3 13830 1727204108.66566: Calling groups_inventory to load vars for managed-node3 13830 1727204108.66568: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204108.66577: Calling all_plugins_play to load vars for managed-node3 13830 1727204108.66588: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204108.66592: Calling groups_plugins_play to load vars for managed-node3 13830 1727204108.70993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204108.71917: done with get_vars() 13830 1727204108.71936: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:72 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.071) 0:00:41.798 ***** 13830 1727204108.71995: entering _queue_task() for managed-node3/include_tasks 13830 1727204108.72248: worker is 1 (out of 1 available) 13830 1727204108.72267: exiting _queue_task() for managed-node3/include_tasks 13830 1727204108.72279: done queuing things up, now waiting for results queue to drain 13830 1727204108.72281: waiting for pending results... 13830 1727204108.72488: running TaskExecutor() for managed-node3/TASK: Include the task 'run_test.yml' 13830 1727204108.72562: in run() - task 0affcd87-79f5-1659-6b02-000000000011 13830 1727204108.72576: variable 'ansible_search_path' from source: unknown 13830 1727204108.72606: calling self._execute() 13830 1727204108.72684: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.72693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.72713: variable 'omit' from source: magic vars 13830 1727204108.73070: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.73076: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.73079: _execute() done 13830 1727204108.73082: dumping result to json 13830 1727204108.73085: done dumping result, returning 13830 1727204108.73087: done running TaskExecutor() for managed-node3/TASK: Include the task 'run_test.yml' [0affcd87-79f5-1659-6b02-000000000011] 13830 1727204108.73095: sending task result for task 0affcd87-79f5-1659-6b02-000000000011 13830 1727204108.73227: done sending task result for task 0affcd87-79f5-1659-6b02-000000000011 13830 1727204108.73230: WORKER PROCESS EXITING 13830 1727204108.73287: no more pending results, returning what we have 13830 1727204108.73297: in VariableManager get_vars() 13830 1727204108.73347: Calling all_inventory to load vars for managed-node3 13830 1727204108.73350: Calling groups_inventory to load vars for managed-node3 13830 1727204108.73352: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204108.73362: Calling all_plugins_play to load vars for managed-node3 13830 1727204108.73366: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204108.73368: Calling groups_plugins_play to load vars for managed-node3 13830 1727204108.74934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204108.77207: done with get_vars() 13830 1727204108.77238: variable 
'ansible_search_path' from source: unknown 13830 1727204108.77255: we have included files to process 13830 1727204108.77256: generating all_blocks data 13830 1727204108.77261: done generating all_blocks data 13830 1727204108.77270: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 13830 1727204108.77271: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 13830 1727204108.77273: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 13830 1727204108.77749: in VariableManager get_vars() 13830 1727204108.77775: done with get_vars() 13830 1727204108.77830: in VariableManager get_vars() 13830 1727204108.77857: done with get_vars() 13830 1727204108.77903: in VariableManager get_vars() 13830 1727204108.77925: done with get_vars() 13830 1727204108.77987: in VariableManager get_vars() 13830 1727204108.78008: done with get_vars() 13830 1727204108.78051: in VariableManager get_vars() 13830 1727204108.78076: done with get_vars() 13830 1727204108.78506: in VariableManager get_vars() 13830 1727204108.78528: done with get_vars() 13830 1727204108.78542: done processing included file 13830 1727204108.78544: iterating over new_blocks loaded from include file 13830 1727204108.78545: in VariableManager get_vars() 13830 1727204108.78559: done with get_vars() 13830 1727204108.78561: filtering new block on tags 13830 1727204108.78671: done filtering new block on tags 13830 1727204108.78674: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed-node3 13830 1727204108.78680: extending task lists for all hosts with included blocks 13830 1727204108.78723: done extending task lists 13830 1727204108.78725: done processing included files 13830 1727204108.78726: results queue empty 13830 1727204108.78727: checking for any_errors_fatal 13830 1727204108.78731: done checking for any_errors_fatal 13830 1727204108.78732: checking for max_fail_percentage 13830 1727204108.78733: done checking for max_fail_percentage 13830 1727204108.78734: checking to see if all hosts have failed and the running result is not ok 13830 1727204108.78735: done checking to see if all hosts have failed 13830 1727204108.78735: getting the remaining hosts for this loop 13830 1727204108.78737: done getting the remaining hosts for this loop 13830 1727204108.78739: getting the next task for host managed-node3 13830 1727204108.78743: done getting next task for host managed-node3 13830 1727204108.78746: ^ task is: TASK: TEST: {{ lsr_description }} 13830 1727204108.78748: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204108.78751: getting variables 13830 1727204108.78751: in VariableManager get_vars() 13830 1727204108.78762: Calling all_inventory to load vars for managed-node3 13830 1727204108.78766: Calling groups_inventory to load vars for managed-node3 13830 1727204108.78768: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204108.78774: Calling all_plugins_play to load vars for managed-node3 13830 1727204108.78777: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204108.78779: Calling groups_plugins_play to load vars for managed-node3 13830 1727204108.80105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204108.81748: done with get_vars() 13830 1727204108.81786: done getting variables 13830 1727204108.81838: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204108.81969: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.100) 0:00:41.898 ***** 13830 1727204108.82031: entering _queue_task() for managed-node3/debug 13830 1727204108.82399: worker is 1 (out of 1 available) 13830 1727204108.82417: exiting _queue_task() for managed-node3/debug 13830 1727204108.82428: done queuing things up, now waiting for results queue to drain 13830 1727204108.82430: waiting for pending results... 13830 1727204108.82754: running TaskExecutor() for managed-node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 
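The lines above show the driver pattern these tests use: tests_bond_options.yml includes tasks/run_test.yml, whose first task (run_test.yml:5) is a debug task with a name and message templated from lsr_description. A minimal sketch of that pattern, assuming the relative include path and the exact banner formatting (the file paths, line numbers, and rendered message are taken from the log; the YAML wording is an assumption):

    # Sketch only: paths and the rendered banner come from the log;
    # the exact YAML wording of both tasks is assumed.
    # In tests_bond_options.yml (task path logged as line 72):
    - name: Include the task 'run_test.yml'
      ansible.builtin.include_tasks: tasks/run_test.yml

    # First task inside tasks/run_test.yml (logged as run_test.yml:5):
    - name: "TEST: {{ lsr_description }}"
      ansible.builtin.debug:
        msg: "##########\n{{ lsr_description }}\n##########"

Because the task name itself is templated, the TASK banner printed above carries the full lsr_description text rather than the literal "{{ lsr_description }}".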
13830 1727204108.82893: in run() - task 0affcd87-79f5-1659-6b02-0000000008ea 13830 1727204108.82914: variable 'ansible_search_path' from source: unknown 13830 1727204108.82922: variable 'ansible_search_path' from source: unknown 13830 1727204108.82966: calling self._execute() 13830 1727204108.83075: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.83094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.83110: variable 'omit' from source: magic vars 13830 1727204108.83511: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.83538: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.83548: variable 'omit' from source: magic vars 13830 1727204108.83594: variable 'omit' from source: magic vars 13830 1727204108.83708: variable 'lsr_description' from source: include params 13830 1727204108.83734: variable 'omit' from source: magic vars 13830 1727204108.83792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204108.83835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.83872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204108.83895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.83912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.83947: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.83958: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.83973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.84083: Set connection var ansible_connection to ssh 13830 1727204108.84100: Set connection var ansible_timeout to 10 13830 1727204108.84111: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.84118: Set connection var ansible_shell_type to sh 13830 1727204108.84128: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.84142: Set connection var ansible_pipelining to False 13830 1727204108.84173: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.84190: variable 'ansible_connection' from source: unknown 13830 1727204108.84198: variable 'ansible_module_compression' from source: unknown 13830 1727204108.84205: variable 'ansible_shell_type' from source: unknown 13830 1727204108.84211: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.84217: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.84224: variable 'ansible_pipelining' from source: unknown 13830 1727204108.84230: variable 'ansible_timeout' from source: unknown 13830 1727204108.84236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.84391: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.84417: variable 'omit' from source: magic vars 13830 1727204108.84428: 
starting attempt loop 13830 1727204108.84435: running the handler 13830 1727204108.84488: handler run complete 13830 1727204108.84515: attempt loop complete, returning result 13830 1727204108.84523: _execute() done 13830 1727204108.84532: dumping result to json 13830 1727204108.84540: done dumping result, returning 13830 1727204108.84551: done running TaskExecutor() for managed-node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [0affcd87-79f5-1659-6b02-0000000008ea] 13830 1727204108.84561: sending task result for task 0affcd87-79f5-1659-6b02-0000000008ea ok: [managed-node3] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 13830 1727204108.84743: no more pending results, returning what we have 13830 1727204108.84748: results queue empty 13830 1727204108.84748: checking for any_errors_fatal 13830 1727204108.84751: done checking for any_errors_fatal 13830 1727204108.84751: checking for max_fail_percentage 13830 1727204108.84754: done checking for max_fail_percentage 13830 1727204108.84755: checking to see if all hosts have failed and the running result is not ok 13830 1727204108.84756: done checking to see if all hosts have failed 13830 1727204108.84757: getting the remaining hosts for this loop 13830 1727204108.84758: done getting the remaining hosts for this loop 13830 1727204108.84763: getting the next task for host managed-node3 13830 1727204108.84773: done getting next task for host managed-node3 13830 1727204108.84777: ^ task is: TASK: Show item 13830 1727204108.84781: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204108.84785: getting variables 13830 1727204108.84787: in VariableManager get_vars() 13830 1727204108.84834: Calling all_inventory to load vars for managed-node3 13830 1727204108.84837: Calling groups_inventory to load vars for managed-node3 13830 1727204108.84840: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204108.84853: Calling all_plugins_play to load vars for managed-node3 13830 1727204108.84856: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204108.84859: Calling groups_plugins_play to load vars for managed-node3 13830 1727204108.85897: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008ea 13830 1727204108.85901: WORKER PROCESS EXITING 13830 1727204108.86888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204108.88707: done with get_vars() 13830 1727204108.88741: done getting variables 13830 1727204108.88813: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Tuesday 24 September 2024 14:55:08 -0400 (0:00:00.068) 0:00:41.966 ***** 13830 1727204108.88852: entering _queue_task() for managed-node3/debug 13830 1727204108.89214: worker is 1 (out of 1 available) 13830 1727204108.89228: exiting _queue_task() for managed-node3/debug 13830 1727204108.89240: done queuing things up, now waiting for results queue to drain 13830 1727204108.89242: waiting for pending results... 
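The "Show item" task queued here loops over the lsr_* control variables and prints each one. The per-item results that follow (ansible_loop_var=item, with the item name echoed next to its value) are consistent with a debug task using var: "{{ item }}" over a fixed list; the sketch below is built on that inference, with the item order read off the logged output:

    # Sketch only: loop contents and ordering are taken from the per-item
    # results below; the use of var: "{{ item }}" is inferred from the
    # output shape, not confirmed by the log.
    - name: Show item
      ansible.builtin.debug:
        var: "{{ item }}"
      loop:
        - lsr_description
        - lsr_setup
        - lsr_test
        - lsr_assert
        - lsr_assert_when
        - lsr_fail_debug
        - lsr_cleanup

As the results below show, lsr_assert_when is reported as "VARIABLE IS NOT DEFINED!": debug tolerates an undefined variable passed via var and prints a notice instead of failing the task.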
13830 1727204108.89611: running TaskExecutor() for managed-node3/TASK: Show item 13830 1727204108.89703: in run() - task 0affcd87-79f5-1659-6b02-0000000008eb 13830 1727204108.89718: variable 'ansible_search_path' from source: unknown 13830 1727204108.89721: variable 'ansible_search_path' from source: unknown 13830 1727204108.89780: variable 'omit' from source: magic vars 13830 1727204108.89984: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.89990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.89993: variable 'omit' from source: magic vars 13830 1727204108.90359: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.90375: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.90380: variable 'omit' from source: magic vars 13830 1727204108.90418: variable 'omit' from source: magic vars 13830 1727204108.90463: variable 'item' from source: unknown 13830 1727204108.90558: variable 'item' from source: unknown 13830 1727204108.90564: variable 'omit' from source: magic vars 13830 1727204108.90598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204108.90634: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.90659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204108.90679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.90691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.90723: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.90727: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.90729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.90832: Set connection var ansible_connection to ssh 13830 1727204108.90846: Set connection var ansible_timeout to 10 13830 1727204108.90851: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.90853: Set connection var ansible_shell_type to sh 13830 1727204108.90859: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.90872: Set connection var ansible_pipelining to False 13830 1727204108.90892: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.90895: variable 'ansible_connection' from source: unknown 13830 1727204108.90900: variable 'ansible_module_compression' from source: unknown 13830 1727204108.90902: variable 'ansible_shell_type' from source: unknown 13830 1727204108.90905: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.90908: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.90910: variable 'ansible_pipelining' from source: unknown 13830 1727204108.90913: variable 'ansible_timeout' from source: unknown 13830 1727204108.90915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.91090: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.91094: variable 'omit' from source: magic vars 13830 1727204108.91096: starting attempt loop 13830 1727204108.91099: running the handler 13830 1727204108.91129: variable 'lsr_description' from source: include params 13830 1727204108.91201: variable 'lsr_description' from source: include params 13830 1727204108.91213: handler run complete 13830 1727204108.91230: attempt loop complete, returning result 13830 1727204108.91247: variable 'item' from source: unknown 13830 1727204108.91386: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." } 13830 1727204108.91524: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.91528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.91530: variable 'omit' from source: magic vars 13830 1727204108.91585: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.91598: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.91601: variable 'omit' from source: magic vars 13830 1727204108.91608: variable 'omit' from source: magic vars 13830 1727204108.91656: variable 'item' from source: unknown 13830 1727204108.91917: variable 'item' from source: unknown 13830 1727204108.91931: variable 'omit' from source: magic vars 13830 1727204108.91955: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.91963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.91971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.91983: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.91986: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.91989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.92062: Set connection var ansible_connection to ssh 13830 1727204108.92073: Set connection var ansible_timeout to 10 13830 1727204108.92079: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.92081: Set connection var ansible_shell_type to sh 13830 1727204108.92086: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.92096: Set connection var ansible_pipelining to False 13830 1727204108.92115: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.92118: variable 'ansible_connection' from source: unknown 13830 1727204108.92121: variable 'ansible_module_compression' from source: unknown 13830 1727204108.92123: variable 'ansible_shell_type' from source: unknown 13830 1727204108.92126: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.92128: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.92130: variable 
'ansible_pipelining' from source: unknown 13830 1727204108.92136: variable 'ansible_timeout' from source: unknown 13830 1727204108.92141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.92230: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.92245: variable 'omit' from source: magic vars 13830 1727204108.92248: starting attempt loop 13830 1727204108.92251: running the handler 13830 1727204108.92275: variable 'lsr_setup' from source: include params 13830 1727204108.92345: variable 'lsr_setup' from source: include params 13830 1727204108.92394: handler run complete 13830 1727204108.92407: attempt loop complete, returning result 13830 1727204108.92423: variable 'item' from source: unknown 13830 1727204108.92487: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 13830 1727204108.92591: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.92671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.92690: variable 'omit' from source: magic vars 13830 1727204108.92859: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.92872: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.92890: variable 'omit' from source: magic vars 13830 1727204108.92940: variable 'omit' from source: magic vars 13830 1727204108.92990: variable 'item' from source: unknown 13830 1727204108.93068: variable 'item' from source: unknown 13830 1727204108.93089: variable 'omit' from source: magic vars 13830 1727204108.93115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.93130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.93147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.93166: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.93175: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.93186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.93305: Set connection var ansible_connection to ssh 13830 1727204108.93319: Set connection var ansible_timeout to 10 13830 1727204108.93329: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.93337: Set connection var ansible_shell_type to sh 13830 1727204108.93350: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.93370: Set connection var ansible_pipelining to False 13830 1727204108.93394: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.93404: variable 'ansible_connection' from source: unknown 13830 1727204108.93413: variable 'ansible_module_compression' from source: unknown 13830 1727204108.93420: variable 'ansible_shell_type' from source: 
unknown 13830 1727204108.93426: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.93432: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.93440: variable 'ansible_pipelining' from source: unknown 13830 1727204108.93448: variable 'ansible_timeout' from source: unknown 13830 1727204108.93458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.93558: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.93579: variable 'omit' from source: magic vars 13830 1727204108.93588: starting attempt loop 13830 1727204108.93593: running the handler 13830 1727204108.93619: variable 'lsr_test' from source: include params 13830 1727204108.93693: variable 'lsr_test' from source: include params 13830 1727204108.93716: handler run complete 13830 1727204108.93734: attempt loop complete, returning result 13830 1727204108.93753: variable 'item' from source: unknown 13830 1727204108.93827: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile_reconfigure.yml" ] } 13830 1727204108.93993: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.94007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.94022: variable 'omit' from source: magic vars 13830 1727204108.94251: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.94270: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.94278: variable 'omit' from source: magic vars 13830 1727204108.94297: variable 'omit' from source: magic vars 13830 1727204108.94340: variable 'item' from source: unknown 13830 1727204108.94725: variable 'item' from source: unknown 13830 1727204108.94751: variable 'omit' from source: magic vars 13830 1727204108.94779: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.94799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.94814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.94831: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.94839: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.94847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.94932: Set connection var ansible_connection to ssh 13830 1727204108.94947: Set connection var ansible_timeout to 10 13830 1727204108.94957: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.94963: Set connection var ansible_shell_type to sh 13830 1727204108.94976: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.94989: Set connection var ansible_pipelining to False 13830 1727204108.95022: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.95030: variable 'ansible_connection' from 
source: unknown 13830 1727204108.95036: variable 'ansible_module_compression' from source: unknown 13830 1727204108.95043: variable 'ansible_shell_type' from source: unknown 13830 1727204108.95049: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.95055: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.95062: variable 'ansible_pipelining' from source: unknown 13830 1727204108.95070: variable 'ansible_timeout' from source: unknown 13830 1727204108.95077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.95182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.95195: variable 'omit' from source: magic vars 13830 1727204108.95204: starting attempt loop 13830 1727204108.95211: running the handler 13830 1727204108.95243: variable 'lsr_assert' from source: include params 13830 1727204108.95317: variable 'lsr_assert' from source: include params 13830 1727204108.95347: handler run complete 13830 1727204108.96102: attempt loop complete, returning result 13830 1727204108.96128: variable 'item' from source: unknown 13830 1727204108.96212: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_bond_options.yml" ] } 13830 1727204108.96506: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.96521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.96539: variable 'omit' from source: magic vars 13830 1727204108.96876: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.96894: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.96903: variable 'omit' from source: magic vars 13830 1727204108.96923: variable 'omit' from source: magic vars 13830 1727204108.96970: variable 'item' from source: unknown 13830 1727204108.97046: variable 'item' from source: unknown 13830 1727204108.97068: variable 'omit' from source: magic vars 13830 1727204108.97096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.97116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.97128: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.97146: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.97154: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.97161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.97276: Set connection var ansible_connection to ssh 13830 1727204108.97290: Set connection var ansible_timeout to 10 13830 1727204108.97301: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.97308: Set connection var ansible_shell_type to sh 13830 1727204108.97319: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.97339: Set connection var 
ansible_pipelining to False 13830 1727204108.97363: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.97373: variable 'ansible_connection' from source: unknown 13830 1727204108.97381: variable 'ansible_module_compression' from source: unknown 13830 1727204108.97387: variable 'ansible_shell_type' from source: unknown 13830 1727204108.97393: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.97398: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.97404: variable 'ansible_pipelining' from source: unknown 13830 1727204108.97410: variable 'ansible_timeout' from source: unknown 13830 1727204108.97417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.97524: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.97539: variable 'omit' from source: magic vars 13830 1727204108.97554: starting attempt loop 13830 1727204108.97560: running the handler 13830 1727204108.97685: handler run complete 13830 1727204108.97702: attempt loop complete, returning result 13830 1727204108.97721: variable 'item' from source: unknown 13830 1727204108.97801: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 13830 1727204108.97982: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.97994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.98007: variable 'omit' from source: magic vars 13830 1727204108.98182: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.98193: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.98201: variable 'omit' from source: magic vars 13830 1727204108.98219: variable 'omit' from source: magic vars 13830 1727204108.98271: variable 'item' from source: unknown 13830 1727204108.98337: variable 'item' from source: unknown 13830 1727204108.98366: variable 'omit' from source: magic vars 13830 1727204108.98391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.98404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.98416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.98432: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.98440: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.98448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.98532: Set connection var ansible_connection to ssh 13830 1727204108.98547: Set connection var ansible_timeout to 10 13830 1727204108.98557: Set connection var ansible_shell_executable to /bin/sh 13830 1727204108.98574: Set connection var ansible_shell_type to sh 13830 1727204108.98586: Set connection var ansible_module_compression to 
ZIP_DEFLATED 13830 1727204108.98600: Set connection var ansible_pipelining to False 13830 1727204108.98623: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.98631: variable 'ansible_connection' from source: unknown 13830 1727204108.98638: variable 'ansible_module_compression' from source: unknown 13830 1727204108.98645: variable 'ansible_shell_type' from source: unknown 13830 1727204108.98652: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.98658: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.98671: variable 'ansible_pipelining' from source: unknown 13830 1727204108.98682: variable 'ansible_timeout' from source: unknown 13830 1727204108.98691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.98796: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204108.98811: variable 'omit' from source: magic vars 13830 1727204108.98820: starting attempt loop 13830 1727204108.98827: running the handler 13830 1727204108.98851: variable 'lsr_fail_debug' from source: play vars 13830 1727204108.98928: variable 'lsr_fail_debug' from source: play vars 13830 1727204108.98950: handler run complete 13830 1727204108.98971: attempt loop complete, returning result 13830 1727204108.98989: variable 'item' from source: unknown 13830 1727204108.99061: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 13830 1727204108.99238: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.99253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.99274: variable 'omit' from source: magic vars 13830 1727204108.99441: variable 'ansible_distribution_major_version' from source: facts 13830 1727204108.99452: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204108.99461: variable 'omit' from source: magic vars 13830 1727204108.99483: variable 'omit' from source: magic vars 13830 1727204108.99535: variable 'item' from source: unknown 13830 1727204108.99604: variable 'item' from source: unknown 13830 1727204108.99629: variable 'omit' from source: magic vars 13830 1727204108.99653: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204108.99669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.99687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204108.99704: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204108.99717: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.99730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204108.99808: Set connection var ansible_connection to ssh 13830 1727204108.99822: Set connection var ansible_timeout to 10 13830 1727204108.99840: Set connection var 
ansible_shell_executable to /bin/sh 13830 1727204108.99848: Set connection var ansible_shell_type to sh 13830 1727204108.99857: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204108.99873: Set connection var ansible_pipelining to False 13830 1727204108.99896: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.99903: variable 'ansible_connection' from source: unknown 13830 1727204108.99910: variable 'ansible_module_compression' from source: unknown 13830 1727204108.99917: variable 'ansible_shell_type' from source: unknown 13830 1727204108.99924: variable 'ansible_shell_executable' from source: unknown 13830 1727204108.99930: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204108.99945: variable 'ansible_pipelining' from source: unknown 13830 1727204108.99952: variable 'ansible_timeout' from source: unknown 13830 1727204108.99960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.00066: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204109.00080: variable 'omit' from source: magic vars 13830 1727204109.00089: starting attempt loop 13830 1727204109.00096: running the handler 13830 1727204109.00120: variable 'lsr_cleanup' from source: include params 13830 1727204109.00194: variable 'lsr_cleanup' from source: include params 13830 1727204109.00219: handler run complete 13830 1727204109.00239: attempt loop complete, returning result 13830 1727204109.00258: variable 'item' from source: unknown 13830 1727204109.00330: variable 'item' from source: unknown ok: [managed-node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml", "tasks/check_network_dns.yml" ] } 13830 1727204109.00453: dumping result to json 13830 1727204109.00470: done dumping result, returning 13830 1727204109.00483: done running TaskExecutor() for managed-node3/TASK: Show item [0affcd87-79f5-1659-6b02-0000000008eb] 13830 1727204109.00494: sending task result for task 0affcd87-79f5-1659-6b02-0000000008eb 13830 1727204109.00637: no more pending results, returning what we have 13830 1727204109.00641: results queue empty 13830 1727204109.00642: checking for any_errors_fatal 13830 1727204109.00651: done checking for any_errors_fatal 13830 1727204109.00652: checking for max_fail_percentage 13830 1727204109.00654: done checking for max_fail_percentage 13830 1727204109.00655: checking to see if all hosts have failed and the running result is not ok 13830 1727204109.00656: done checking to see if all hosts have failed 13830 1727204109.00656: getting the remaining hosts for this loop 13830 1727204109.00659: done getting the remaining hosts for this loop 13830 1727204109.00663: getting the next task for host managed-node3 13830 1727204109.00673: done getting next task for host managed-node3 13830 1727204109.00676: ^ task is: TASK: Include the task 'show_interfaces.yml' 13830 1727204109.00679: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204109.00684: getting variables 13830 1727204109.00686: in VariableManager get_vars() 13830 1727204109.00729: Calling all_inventory to load vars for managed-node3 13830 1727204109.00732: Calling groups_inventory to load vars for managed-node3 13830 1727204109.00735: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.00747: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.00750: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.00753: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.01809: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008eb 13830 1727204109.01813: WORKER PROCESS EXITING 13830 1727204109.02850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.05139: done with get_vars() 13830 1727204109.05168: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.164) 0:00:42.130 ***** 13830 1727204109.05271: entering _queue_task() for managed-node3/include_tasks 13830 1727204109.05633: worker is 1 (out of 1 available) 13830 1727204109.05649: exiting _queue_task() for managed-node3/include_tasks 13830 1727204109.05663: done queuing things up, now waiting for results queue to drain 13830 1727204109.05666: waiting for pending results... 
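
The 'Show item' results above print, for each loop item, both the item name and the value of the variable that name refers to (or a "VARIABLE IS NOT DEFINED!" placeholder, as with lsr_assert_when). That pattern is consistent with a debug task whose var option is templated to the loop item. A minimal sketch, assuming the loop list matches the items visible in this output; the real run_test.yml may iterate over additional names that scrolled by earlier:

  - name: Show item
    debug:
      var: "{{ item }}"   # double interpolation: print the variable named by the item
    loop:
      - lsr_assert
      - lsr_assert_when
      - lsr_fail_debug
      - lsr_cleanup

Because var resolves the name at display time, an undefined entry is reported inline rather than failing the task, which matches the ok status for lsr_assert_when above.
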
13830 1727204109.05961: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 13830 1727204109.06112: in run() - task 0affcd87-79f5-1659-6b02-0000000008ec 13830 1727204109.06133: variable 'ansible_search_path' from source: unknown 13830 1727204109.06142: variable 'ansible_search_path' from source: unknown 13830 1727204109.06187: calling self._execute() 13830 1727204109.06377: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.06388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.06402: variable 'omit' from source: magic vars 13830 1727204109.06818: variable 'ansible_distribution_major_version' from source: facts 13830 1727204109.06864: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204109.06928: _execute() done 13830 1727204109.06938: dumping result to json 13830 1727204109.06946: done dumping result, returning 13830 1727204109.06961: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-1659-6b02-0000000008ec] 13830 1727204109.06976: sending task result for task 0affcd87-79f5-1659-6b02-0000000008ec 13830 1727204109.07118: no more pending results, returning what we have 13830 1727204109.07124: in VariableManager get_vars() 13830 1727204109.07177: Calling all_inventory to load vars for managed-node3 13830 1727204109.07181: Calling groups_inventory to load vars for managed-node3 13830 1727204109.07183: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.07200: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.07203: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.07207: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.08352: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008ec 13830 1727204109.08355: WORKER PROCESS EXITING 13830 1727204109.09110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.11092: done with get_vars() 13830 1727204109.11120: variable 'ansible_search_path' from source: unknown 13830 1727204109.11122: variable 'ansible_search_path' from source: unknown 13830 1727204109.11170: we have included files to process 13830 1727204109.11172: generating all_blocks data 13830 1727204109.11177: done generating all_blocks data 13830 1727204109.11183: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 13830 1727204109.11184: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 13830 1727204109.11187: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 13830 1727204109.11307: in VariableManager get_vars() 13830 1727204109.11332: done with get_vars() 13830 1727204109.11638: done processing included file 13830 1727204109.11640: iterating over new_blocks loaded from include file 13830 1727204109.11642: in VariableManager get_vars() 13830 1727204109.11662: done with get_vars() 13830 1727204109.11666: filtering new block on tags 13830 1727204109.11703: done filtering new block on tags 13830 1727204109.11706: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 13830 1727204109.11775: extending task lists for all hosts with included blocks 13830 1727204109.12323: done extending task lists 13830 1727204109.12325: done processing included files 13830 1727204109.12325: results queue empty 13830 1727204109.12326: checking for any_errors_fatal 13830 1727204109.12334: done checking for any_errors_fatal 13830 1727204109.12335: checking for max_fail_percentage 13830 1727204109.12336: done checking for max_fail_percentage 13830 1727204109.12337: checking to see if all hosts have failed and the running result is not ok 13830 1727204109.12338: done checking to see if all hosts have failed 13830 1727204109.12339: getting the remaining hosts for this loop 13830 1727204109.12340: done getting the remaining hosts for this loop 13830 1727204109.12343: getting the next task for host managed-node3 13830 1727204109.12347: done getting next task for host managed-node3 13830 1727204109.12349: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 13830 1727204109.12353: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204109.12355: getting variables 13830 1727204109.12356: in VariableManager get_vars() 13830 1727204109.12387: Calling all_inventory to load vars for managed-node3 13830 1727204109.12390: Calling groups_inventory to load vars for managed-node3 13830 1727204109.12392: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.12397: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.12400: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.12403: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.14007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.16918: done with get_vars() 13830 1727204109.16950: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.118) 0:00:42.249 ***** 13830 1727204109.17142: entering _queue_task() for managed-node3/include_tasks 13830 1727204109.17621: worker is 1 (out of 1 available) 13830 1727204109.17633: exiting _queue_task() for managed-node3/include_tasks 13830 1727204109.17645: done queuing things up, now waiting for results queue to drain 13830 1727204109.17647: waiting for pending results... 
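
From the include resolution above and the 'Show current_interfaces' task that turns up at the end of this section, the included show_interfaces.yml plausibly does two things: include get_current_interfaces.yml, then print the resulting fact. A reconstructed sketch under that assumption; the log confirms only the task names and their order, not the file contents or the debug message:

  # show_interfaces.yml (sketch; relative include path and msg wording are assumptions)
  - name: Include the task 'get_current_interfaces.yml'
    include_tasks: get_current_interfaces.yml

  - name: Show current_interfaces
    debug:
      msg: "current_interfaces: {{ current_interfaces }}"
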
13830 1727204109.18908: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 13830 1727204109.19022: in run() - task 0affcd87-79f5-1659-6b02-000000000913 13830 1727204109.19033: variable 'ansible_search_path' from source: unknown 13830 1727204109.19039: variable 'ansible_search_path' from source: unknown 13830 1727204109.19078: calling self._execute() 13830 1727204109.19183: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.19188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.19200: variable 'omit' from source: magic vars 13830 1727204109.19773: variable 'ansible_distribution_major_version' from source: facts 13830 1727204109.19791: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204109.19802: _execute() done 13830 1727204109.19810: dumping result to json 13830 1727204109.19818: done dumping result, returning 13830 1727204109.19827: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-1659-6b02-000000000913] 13830 1727204109.19846: sending task result for task 0affcd87-79f5-1659-6b02-000000000913 13830 1727204109.19990: no more pending results, returning what we have 13830 1727204109.19997: in VariableManager get_vars() 13830 1727204109.20049: Calling all_inventory to load vars for managed-node3 13830 1727204109.20053: Calling groups_inventory to load vars for managed-node3 13830 1727204109.20055: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.20075: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.20079: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.20082: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.21295: done sending task result for task 0affcd87-79f5-1659-6b02-000000000913 13830 1727204109.21299: WORKER PROCESS EXITING 13830 1727204109.22310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.25007: done with get_vars() 13830 1727204109.25035: variable 'ansible_search_path' from source: unknown 13830 1727204109.25036: variable 'ansible_search_path' from source: unknown 13830 1727204109.25083: we have included files to process 13830 1727204109.25085: generating all_blocks data 13830 1727204109.25087: done generating all_blocks data 13830 1727204109.25088: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 13830 1727204109.25090: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 13830 1727204109.25092: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 13830 1727204109.25379: done processing included file 13830 1727204109.25381: iterating over new_blocks loaded from include file 13830 1727204109.25383: in VariableManager get_vars() 13830 1727204109.25403: done with get_vars() 13830 1727204109.25405: filtering new block on tags 13830 1727204109.25441: done filtering new block on tags 13830 1727204109.25444: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node3 13830 1727204109.25449: extending task lists for all hosts with included blocks 13830 1727204109.25624: done extending task lists 13830 1727204109.25625: done processing included files 13830 1727204109.25626: results queue empty 13830 1727204109.25627: checking for any_errors_fatal 13830 1727204109.25631: done checking for any_errors_fatal 13830 1727204109.25631: checking for max_fail_percentage 13830 1727204109.25632: done checking for max_fail_percentage 13830 1727204109.25633: checking to see if all hosts have failed and the running result is not ok 13830 1727204109.25634: done checking to see if all hosts have failed 13830 1727204109.25635: getting the remaining hosts for this loop 13830 1727204109.25636: done getting the remaining hosts for this loop 13830 1727204109.25638: getting the next task for host managed-node3 13830 1727204109.25643: done getting next task for host managed-node3 13830 1727204109.25645: ^ task is: TASK: Gather current interface info 13830 1727204109.25649: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204109.25651: getting variables 13830 1727204109.25652: in VariableManager get_vars() 13830 1727204109.25665: Calling all_inventory to load vars for managed-node3 13830 1727204109.25668: Calling groups_inventory to load vars for managed-node3 13830 1727204109.25670: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.25675: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.25678: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.25680: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.27061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.29584: done with get_vars() 13830 1727204109.29624: done getting variables 13830 1727204109.29675: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.125) 0:00:42.375 ***** 13830 1727204109.29710: entering _queue_task() for managed-node3/command 13830 1727204109.30083: worker is 1 (out of 1 available) 13830 1727204109.30097: exiting _queue_task() for managed-node3/command 13830 1727204109.30110: done queuing things up, now waiting for results queue to drain 13830 1727204109.30112: waiting for pending results... 
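
The module_args recorded later in this task's result ("chdir": "/sys/class/net", "_raw_params": "ls -1") indicate a plain command task that lists the kernel's network-interface directory. A minimal sketch, assuming the result is registered as the _current_interfaces variable that the later 'Set current_interfaces' task resolves; changed_when is inferred from the displayed result reporting changed: false even though the raw module output says changed: true:

  - name: Gather current interface info
    command: ls -1
    args:
      chdir: /sys/class/net        # one entry per interface, plus bonding_masters when bonding is loaded
    register: _current_interfaces  # assumed name, taken from the later set_fact task
    changed_when: false            # inferred: the final task result shows changed: false

The STDOUT of bonding_masters, eth0, lo further down is exactly what such a listing returns on this host.
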
13830 1727204109.30423: running TaskExecutor() for managed-node3/TASK: Gather current interface info 13830 1727204109.30574: in run() - task 0affcd87-79f5-1659-6b02-00000000094e 13830 1727204109.30596: variable 'ansible_search_path' from source: unknown 13830 1727204109.30608: variable 'ansible_search_path' from source: unknown 13830 1727204109.30650: calling self._execute() 13830 1727204109.30759: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.30777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.30794: variable 'omit' from source: magic vars 13830 1727204109.31203: variable 'ansible_distribution_major_version' from source: facts 13830 1727204109.31225: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204109.31236: variable 'omit' from source: magic vars 13830 1727204109.31306: variable 'omit' from source: magic vars 13830 1727204109.31349: variable 'omit' from source: magic vars 13830 1727204109.31405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204109.31582: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204109.31611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204109.31639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204109.31663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204109.31701: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204109.31710: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.31718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.31838: Set connection var ansible_connection to ssh 13830 1727204109.31855: Set connection var ansible_timeout to 10 13830 1727204109.31871: Set connection var ansible_shell_executable to /bin/sh 13830 1727204109.31879: Set connection var ansible_shell_type to sh 13830 1727204109.31890: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204109.31905: Set connection var ansible_pipelining to False 13830 1727204109.31932: variable 'ansible_shell_executable' from source: unknown 13830 1727204109.31941: variable 'ansible_connection' from source: unknown 13830 1727204109.31949: variable 'ansible_module_compression' from source: unknown 13830 1727204109.31955: variable 'ansible_shell_type' from source: unknown 13830 1727204109.31962: variable 'ansible_shell_executable' from source: unknown 13830 1727204109.31973: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.31982: variable 'ansible_pipelining' from source: unknown 13830 1727204109.31990: variable 'ansible_timeout' from source: unknown 13830 1727204109.31998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.32141: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204109.32281: variable 'omit' from source: magic vars 13830 
1727204109.32293: starting attempt loop 13830 1727204109.32299: running the handler 13830 1727204109.32318: _low_level_execute_command(): starting 13830 1727204109.32330: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204109.33489: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204109.33511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.33528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.33548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.33596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.33610: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204109.33626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.33648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204109.33660: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204109.33675: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204109.33689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.33705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.33722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.33738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.33750: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204109.33766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.33843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204109.33870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204109.33889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204109.33980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204109.35637: stdout chunk (state=3): >>>/root <<< 13830 1727204109.35785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204109.35849: stderr chunk (state=3): >>><<< 13830 1727204109.35852: stdout chunk (state=3): >>><<< 13830 1727204109.35975: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204109.35979: _low_level_execute_command(): starting 13830 1727204109.35983: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099 `" && echo ansible-tmp-1727204109.3587947-17037-246278655610099="` echo /root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099 `" ) && sleep 0' 13830 1727204109.36838: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204109.36852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.36871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.36891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.36934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.36948: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204109.36966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.36984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204109.36996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204109.37009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204109.37021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.37035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.37052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.37067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.37080: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204109.37094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.37170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204109.37194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204109.37213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204109.37292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204109.39142: stdout chunk (state=3): >>>ansible-tmp-1727204109.3587947-17037-246278655610099=/root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099 <<< 13830 1727204109.39344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204109.39348: stdout chunk (state=3): >>><<< 13830 1727204109.39350: stderr chunk (state=3): >>><<< 13830 1727204109.39471: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204109.3587947-17037-246278655610099=/root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204109.39475: variable 'ansible_module_compression' from source: unknown 13830 1727204109.39478: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204109.39670: variable 'ansible_facts' from source: unknown 13830 1727204109.39674: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099/AnsiballZ_command.py 13830 1727204109.39918: Sending initial data 13830 1727204109.39922: Sent initial data (156 bytes) 13830 1727204109.41008: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204109.41026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.41043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.41062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.41107: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.41120: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204109.41134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.41152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204109.41167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204109.41179: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204109.41193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.41207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.41225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.41239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.41255: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204109.41274: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.41348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204109.41374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204109.41391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204109.41461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204109.43172: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204109.43205: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204109.43245: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpmwfvyljv /root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099/AnsiballZ_command.py <<< 13830 1727204109.43279: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204109.44619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204109.44872: stderr chunk (state=3): >>><<< 13830 1727204109.44876: stdout chunk (state=3): >>><<< 13830 1727204109.44974: done transferring module to remote 13830 1727204109.44981: _low_level_execute_command(): starting 13830 1727204109.44984: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099/ /root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099/AnsiballZ_command.py && sleep 0' 13830 1727204109.45544: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204109.45560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.45577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.45593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.45633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.45644: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204109.45656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.45676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204109.45687: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204109.45697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204109.45707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.45719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.45733: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.45743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.45752: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204109.45771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.45844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204109.45868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204109.45884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204109.45952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204109.47689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204109.47760: stderr chunk (state=3): >>><<< 13830 1727204109.47766: stdout chunk (state=3): >>><<< 13830 1727204109.47859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204109.47865: _low_level_execute_command(): starting 13830 1727204109.47868: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099/AnsiballZ_command.py && sleep 0' 13830 1727204109.49660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.49667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.49708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.49718: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204109.49733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.49749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204109.49757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204109.49765: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204109.49773: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.49783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.49799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.49807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.49813: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204109.49823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.49904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204109.49921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204109.49927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204109.50007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204109.63607: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:09.631553", "end": "2024-09-24 14:55:09.635037", "delta": "0:00:00.003484", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204109.64998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204109.65060: stderr chunk (state=3): >>><<< 13830 1727204109.65075: stdout chunk (state=3): >>><<< 13830 1727204109.65086: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:09.631553", "end": "2024-09-24 14:55:09.635037", "delta": "0:00:00.003484", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.87 closed. 13830 1727204109.65128: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204109.65138: _low_level_execute_command(): starting 13830 1727204109.65144: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204109.3587947-17037-246278655610099/ > /dev/null 2>&1 && sleep 0' 13830 1727204109.66861: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204109.66931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.66945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.66959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.67001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.67080: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204109.67090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.67104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204109.67112: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204109.67119: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204109.67127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204109.67146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204109.67158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204109.67169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204109.67181: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204109.67191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204109.67382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204109.67397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204109.67403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204109.67580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204109.69524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204109.69528: stdout chunk (state=3): >>><<< 13830 1727204109.69539: stderr chunk (state=3): >>><<< 13830 1727204109.69561: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204109.69569: handler run complete 13830 1727204109.69595: Evaluated conditional (False): False 13830 1727204109.69605: attempt loop complete, returning result 13830 1727204109.69608: _execute() done 13830 1727204109.69611: dumping result to json 13830 1727204109.69616: done dumping result, returning 13830 1727204109.69625: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-1659-6b02-00000000094e] 13830 1727204109.69631: sending task result for task 0affcd87-79f5-1659-6b02-00000000094e 13830 1727204109.69749: done sending task result for task 0affcd87-79f5-1659-6b02-00000000094e 13830 1727204109.69752: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003484", "end": "2024-09-24 14:55:09.635037", "rc": 0, "start": "2024-09-24 14:55:09.631553" } STDOUT: bonding_masters eth0 lo 13830 1727204109.69948: no more pending results, returning what we have 13830 1727204109.69952: results queue empty 13830 1727204109.69953: checking for any_errors_fatal 13830 1727204109.69955: done checking for any_errors_fatal 13830 1727204109.69956: checking for max_fail_percentage 13830 1727204109.69957: done checking for max_fail_percentage 13830 1727204109.69958: checking to see if all hosts have failed and the running result is not ok 13830 1727204109.69959: done checking to see if all hosts have failed 13830 1727204109.69960: getting the remaining hosts for this loop 13830 1727204109.69962: done getting the remaining hosts for this loop 13830 1727204109.69968: getting the next task for host managed-node3 13830 1727204109.69975: done getting next task for host managed-node3 13830 1727204109.69977: ^ task is: TASK: Set current_interfaces 13830 1727204109.69984: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204109.69988: getting variables 13830 1727204109.69990: in VariableManager get_vars() 13830 1727204109.70035: Calling all_inventory to load vars for managed-node3 13830 1727204109.70038: Calling groups_inventory to load vars for managed-node3 13830 1727204109.70041: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.70053: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.70055: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.70067: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.72748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.76760: done with get_vars() 13830 1727204109.77022: done getting variables 13830 1727204109.77089: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.475) 0:00:42.850 ***** 13830 1727204109.77244: entering _queue_task() for managed-node3/set_fact 13830 1727204109.77830: worker is 1 (out of 1 available) 13830 1727204109.77845: exiting _queue_task() for managed-node3/set_fact 13830 1727204109.77857: done queuing things up, now waiting for results queue to drain 13830 1727204109.77858: waiting for pending results... 
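
The result that follows sets ansible_facts.current_interfaces to ["bonding_masters", "eth0", "lo"], the same three lines the previous command printed, and the executor resolves a '_current_interfaces' variable while templating. A sketch under the assumption that the fact is simply the stdout lines of that registered result; the exact expression is not shown in the log:

  - name: Set current_interfaces
    set_fact:
      current_interfaces: "{{ _current_interfaces.stdout_lines }}"

No connection is opened for this step (set_fact runs on the controller), which is why the log shows only variable resolution and an immediate 'handler run complete' for this task.
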
13830 1727204109.78750: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 13830 1727204109.79105: in run() - task 0affcd87-79f5-1659-6b02-00000000094f 13830 1727204109.79126: variable 'ansible_search_path' from source: unknown 13830 1727204109.79133: variable 'ansible_search_path' from source: unknown 13830 1727204109.79179: calling self._execute() 13830 1727204109.79427: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.79442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.79527: variable 'omit' from source: magic vars 13830 1727204109.80296: variable 'ansible_distribution_major_version' from source: facts 13830 1727204109.80327: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204109.80344: variable 'omit' from source: magic vars 13830 1727204109.80417: variable 'omit' from source: magic vars 13830 1727204109.80546: variable '_current_interfaces' from source: set_fact 13830 1727204109.80620: variable 'omit' from source: magic vars 13830 1727204109.80669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204109.80712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204109.80737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204109.80758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204109.80775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204109.80807: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204109.80820: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.80827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.80930: Set connection var ansible_connection to ssh 13830 1727204109.80946: Set connection var ansible_timeout to 10 13830 1727204109.80956: Set connection var ansible_shell_executable to /bin/sh 13830 1727204109.80962: Set connection var ansible_shell_type to sh 13830 1727204109.80975: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204109.80989: Set connection var ansible_pipelining to False 13830 1727204109.81015: variable 'ansible_shell_executable' from source: unknown 13830 1727204109.81022: variable 'ansible_connection' from source: unknown 13830 1727204109.81033: variable 'ansible_module_compression' from source: unknown 13830 1727204109.81040: variable 'ansible_shell_type' from source: unknown 13830 1727204109.81046: variable 'ansible_shell_executable' from source: unknown 13830 1727204109.81053: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.81060: variable 'ansible_pipelining' from source: unknown 13830 1727204109.81068: variable 'ansible_timeout' from source: unknown 13830 1727204109.81075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.81218: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 13830 1727204109.81234: variable 'omit' from source: magic vars 13830 1727204109.81245: starting attempt loop 13830 1727204109.81255: running the handler 13830 1727204109.81272: handler run complete 13830 1727204109.81285: attempt loop complete, returning result 13830 1727204109.81291: _execute() done 13830 1727204109.81298: dumping result to json 13830 1727204109.81305: done dumping result, returning 13830 1727204109.81315: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-1659-6b02-00000000094f] 13830 1727204109.81324: sending task result for task 0affcd87-79f5-1659-6b02-00000000094f ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 13830 1727204109.81489: no more pending results, returning what we have 13830 1727204109.81493: results queue empty 13830 1727204109.81493: checking for any_errors_fatal 13830 1727204109.81504: done checking for any_errors_fatal 13830 1727204109.81504: checking for max_fail_percentage 13830 1727204109.81506: done checking for max_fail_percentage 13830 1727204109.81507: checking to see if all hosts have failed and the running result is not ok 13830 1727204109.81507: done checking to see if all hosts have failed 13830 1727204109.81508: getting the remaining hosts for this loop 13830 1727204109.81510: done getting the remaining hosts for this loop 13830 1727204109.81514: getting the next task for host managed-node3 13830 1727204109.81522: done getting next task for host managed-node3 13830 1727204109.81525: ^ task is: TASK: Show current_interfaces 13830 1727204109.81530: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204109.81538: getting variables 13830 1727204109.81539: in VariableManager get_vars() 13830 1727204109.81579: Calling all_inventory to load vars for managed-node3 13830 1727204109.81581: Calling groups_inventory to load vars for managed-node3 13830 1727204109.81584: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.81591: done sending task result for task 0affcd87-79f5-1659-6b02-00000000094f 13830 1727204109.81595: WORKER PROCESS EXITING 13830 1727204109.81607: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.81610: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.81613: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.84103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.85977: done with get_vars() 13830 1727204109.86006: done getting variables 13830 1727204109.86099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.088) 0:00:42.939 ***** 13830 1727204109.86134: entering _queue_task() for managed-node3/debug 13830 1727204109.86469: worker is 1 (out of 1 available) 13830 1727204109.86483: exiting _queue_task() for managed-node3/debug 13830 1727204109.86498: done queuing things up, now waiting for results queue to drain 13830 1727204109.86501: waiting for pending results... 
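The "Show current_interfaces" task queued here is a debug action (the debug ActionModule load is visible just above). A plausible sketch of the task at tasks/show_interfaces.yml:5, with the message template inferred from the MSG line that appears further down in the trace:

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"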
13830 1727204109.86796: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 13830 1727204109.86913: in run() - task 0affcd87-79f5-1659-6b02-000000000914 13830 1727204109.86926: variable 'ansible_search_path' from source: unknown 13830 1727204109.86935: variable 'ansible_search_path' from source: unknown 13830 1727204109.86978: calling self._execute() 13830 1727204109.87075: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.87080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.87091: variable 'omit' from source: magic vars 13830 1727204109.87462: variable 'ansible_distribution_major_version' from source: facts 13830 1727204109.87482: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204109.87487: variable 'omit' from source: magic vars 13830 1727204109.87538: variable 'omit' from source: magic vars 13830 1727204109.87643: variable 'current_interfaces' from source: set_fact 13830 1727204109.87671: variable 'omit' from source: magic vars 13830 1727204109.87783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204109.87817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204109.87820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204109.87942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204109.87946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204109.87949: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204109.87952: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.87954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.88011: Set connection var ansible_connection to ssh 13830 1727204109.88021: Set connection var ansible_timeout to 10 13830 1727204109.88027: Set connection var ansible_shell_executable to /bin/sh 13830 1727204109.88029: Set connection var ansible_shell_type to sh 13830 1727204109.88035: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204109.88053: Set connection var ansible_pipelining to False 13830 1727204109.88077: variable 'ansible_shell_executable' from source: unknown 13830 1727204109.88080: variable 'ansible_connection' from source: unknown 13830 1727204109.88083: variable 'ansible_module_compression' from source: unknown 13830 1727204109.88085: variable 'ansible_shell_type' from source: unknown 13830 1727204109.88088: variable 'ansible_shell_executable' from source: unknown 13830 1727204109.88090: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.88094: variable 'ansible_pipelining' from source: unknown 13830 1727204109.88097: variable 'ansible_timeout' from source: unknown 13830 1727204109.88101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.88238: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
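The block above records the connection variables the executor settles on for this host: ssh connection, 10-second timeout, /bin/sh as the shell, pipelining off. For reference, the same values could be pinned explicitly in inventory or group vars; the snippet below only mirrors the values reported in the trace and is not a copy of the test's actual inventory:

# group_vars/all.yml (illustrative values mirroring the trace)
ansible_connection: ssh
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false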
13830 1727204109.88246: variable 'omit' from source: magic vars 13830 1727204109.88251: starting attempt loop 13830 1727204109.88291: running the handler 13830 1727204109.88337: handler run complete 13830 1727204109.88348: attempt loop complete, returning result 13830 1727204109.88351: _execute() done 13830 1727204109.88354: dumping result to json 13830 1727204109.88356: done dumping result, returning 13830 1727204109.88363: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-1659-6b02-000000000914] 13830 1727204109.88382: sending task result for task 0affcd87-79f5-1659-6b02-000000000914 13830 1727204109.88470: done sending task result for task 0affcd87-79f5-1659-6b02-000000000914 13830 1727204109.88473: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 13830 1727204109.88526: no more pending results, returning what we have 13830 1727204109.88530: results queue empty 13830 1727204109.88531: checking for any_errors_fatal 13830 1727204109.88539: done checking for any_errors_fatal 13830 1727204109.88540: checking for max_fail_percentage 13830 1727204109.88541: done checking for max_fail_percentage 13830 1727204109.88542: checking to see if all hosts have failed and the running result is not ok 13830 1727204109.88543: done checking to see if all hosts have failed 13830 1727204109.88544: getting the remaining hosts for this loop 13830 1727204109.88546: done getting the remaining hosts for this loop 13830 1727204109.88551: getting the next task for host managed-node3 13830 1727204109.88560: done getting next task for host managed-node3 13830 1727204109.88566: ^ task is: TASK: Setup 13830 1727204109.88570: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204109.88576: getting variables 13830 1727204109.88578: in VariableManager get_vars() 13830 1727204109.88619: Calling all_inventory to load vars for managed-node3 13830 1727204109.88622: Calling groups_inventory to load vars for managed-node3 13830 1727204109.88624: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.88636: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.88638: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.88642: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.90700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.92415: done with get_vars() 13830 1727204109.92445: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Tuesday 24 September 2024 14:55:09 -0400 (0:00:00.064) 0:00:43.003 ***** 13830 1727204109.92547: entering _queue_task() for managed-node3/include_tasks 13830 1727204109.92884: worker is 1 (out of 1 available) 13830 1727204109.92897: exiting _queue_task() for managed-node3/include_tasks 13830 1727204109.92910: done queuing things up, now waiting for results queue to drain 13830 1727204109.92911: waiting for pending results... 13830 1727204109.93207: running TaskExecutor() for managed-node3/TASK: Setup 13830 1727204109.93316: in run() - task 0affcd87-79f5-1659-6b02-0000000008ed 13830 1727204109.93330: variable 'ansible_search_path' from source: unknown 13830 1727204109.93336: variable 'ansible_search_path' from source: unknown 13830 1727204109.93380: variable 'lsr_setup' from source: include params 13830 1727204109.93583: variable 'lsr_setup' from source: include params 13830 1727204109.93655: variable 'omit' from source: magic vars 13830 1727204109.93795: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.93803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.93814: variable 'omit' from source: magic vars 13830 1727204109.94048: variable 'ansible_distribution_major_version' from source: facts 13830 1727204109.94062: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204109.94069: variable 'item' from source: unknown 13830 1727204109.94137: variable 'item' from source: unknown 13830 1727204109.94168: variable 'item' from source: unknown 13830 1727204109.94233: variable 'item' from source: unknown 13830 1727204109.94369: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204109.94374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204109.94377: variable 'omit' from source: magic vars 13830 1727204109.94504: variable 'ansible_distribution_major_version' from source: facts 13830 1727204109.94509: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204109.94514: variable 'item' from source: unknown 13830 1727204109.94576: variable 'item' from source: unknown 13830 1727204109.94610: variable 'item' from source: unknown 13830 1727204109.94670: variable 'item' from source: unknown 13830 1727204109.94741: dumping result to json 13830 1727204109.94745: done dumping result, returning 13830 1727204109.94748: done running TaskExecutor() for managed-node3/TASK: Setup 
[0affcd87-79f5-1659-6b02-0000000008ed] 13830 1727204109.94751: sending task result for task 0affcd87-79f5-1659-6b02-0000000008ed 13830 1727204109.94788: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008ed 13830 1727204109.94791: WORKER PROCESS EXITING 13830 1727204109.94819: no more pending results, returning what we have 13830 1727204109.94824: in VariableManager get_vars() 13830 1727204109.94871: Calling all_inventory to load vars for managed-node3 13830 1727204109.94874: Calling groups_inventory to load vars for managed-node3 13830 1727204109.94876: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204109.94890: Calling all_plugins_play to load vars for managed-node3 13830 1727204109.94894: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204109.94897: Calling groups_plugins_play to load vars for managed-node3 13830 1727204109.96518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204109.98243: done with get_vars() 13830 1727204109.98270: variable 'ansible_search_path' from source: unknown 13830 1727204109.98271: variable 'ansible_search_path' from source: unknown 13830 1727204109.98311: variable 'ansible_search_path' from source: unknown 13830 1727204109.98312: variable 'ansible_search_path' from source: unknown 13830 1727204109.98343: we have included files to process 13830 1727204109.98344: generating all_blocks data 13830 1727204109.98345: done generating all_blocks data 13830 1727204109.98351: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 13830 1727204109.98352: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 13830 1727204109.98354: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 13830 1727204109.99317: done processing included file 13830 1727204109.99319: iterating over new_blocks loaded from include file 13830 1727204109.99321: in VariableManager get_vars() 13830 1727204109.99340: done with get_vars() 13830 1727204109.99342: filtering new block on tags 13830 1727204109.99397: done filtering new block on tags 13830 1727204109.99404: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed-node3 => (item=tasks/create_test_interfaces_with_dhcp.yml) 13830 1727204109.99410: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 13830 1727204109.99411: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 13830 1727204109.99414: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 13830 1727204109.99506: in VariableManager get_vars() 13830 1727204109.99533: done with get_vars() 13830 1727204109.99540: variable 'item' from source: include params 13830 1727204109.99650: variable 'item' from source: include params 13830 1727204109.99685: Loading data from 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13830 1727204109.99771: in VariableManager get_vars() 13830 1727204109.99794: done with get_vars() 13830 1727204109.99961: in VariableManager get_vars() 13830 1727204109.99983: done with get_vars() 13830 1727204109.99989: variable 'item' from source: include params 13830 1727204110.00051: variable 'item' from source: include params 13830 1727204110.00087: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 13830 1727204110.00241: in VariableManager get_vars() 13830 1727204110.00263: done with get_vars() 13830 1727204110.00368: done processing included file 13830 1727204110.00370: iterating over new_blocks loaded from include file 13830 1727204110.00371: in VariableManager get_vars() 13830 1727204110.00392: done with get_vars() 13830 1727204110.00395: filtering new block on tags 13830 1727204110.00473: done filtering new block on tags 13830 1727204110.00476: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed-node3 => (item=tasks/assert_dhcp_device_present.yml) 13830 1727204110.00481: extending task lists for all hosts with included blocks 13830 1727204110.01102: done extending task lists 13830 1727204110.01103: done processing included files 13830 1727204110.01104: results queue empty 13830 1727204110.01105: checking for any_errors_fatal 13830 1727204110.01109: done checking for any_errors_fatal 13830 1727204110.01110: checking for max_fail_percentage 13830 1727204110.01111: done checking for max_fail_percentage 13830 1727204110.01112: checking to see if all hosts have failed and the running result is not ok 13830 1727204110.01113: done checking to see if all hosts have failed 13830 1727204110.01119: getting the remaining hosts for this loop 13830 1727204110.01120: done getting the remaining hosts for this loop 13830 1727204110.01123: getting the next task for host managed-node3 13830 1727204110.01127: done getting next task for host managed-node3 13830 1727204110.01129: ^ task is: TASK: Install dnsmasq 13830 1727204110.01132: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204110.01134: getting variables 13830 1727204110.01135: in VariableManager get_vars() 13830 1727204110.01152: Calling all_inventory to load vars for managed-node3 13830 1727204110.01154: Calling groups_inventory to load vars for managed-node3 13830 1727204110.01156: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204110.01162: Calling all_plugins_play to load vars for managed-node3 13830 1727204110.01166: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204110.01169: Calling groups_plugins_play to load vars for managed-node3 13830 1727204110.02016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204110.02935: done with get_vars() 13830 1727204110.02951: done getting variables 13830 1727204110.02985: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:55:10 -0400 (0:00:00.104) 0:00:43.108 ***** 13830 1727204110.03008: entering _queue_task() for managed-node3/package 13830 1727204110.03298: worker is 1 (out of 1 available) 13830 1727204110.03310: exiting _queue_task() for managed-node3/package 13830 1727204110.03321: done queuing things up, now waiting for results queue to drain 13830 1727204110.03322: waiting for pending results... 
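The "Setup" task at run_test.yml:24 is an include_tasks driven by the lsr_setup include parameter; per the trace it expands to tasks/create_test_interfaces_with_dhcp.yml and tasks/assert_dhcp_device_present.yml for this run. A minimal sketch of that task, assuming a loop over lsr_setup (the exact loop keyword is not visible in the log, and the distribution-version conditional evaluated per item may be inherited from an enclosing block):

- name: Setup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"
  # lsr_setup for this run, per the "included: ... => (item=...)" lines above:
  #   - tasks/create_test_interfaces_with_dhcp.yml
  #   - tasks/assert_dhcp_device_present.yml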
13830 1727204110.03613: running TaskExecutor() for managed-node3/TASK: Install dnsmasq 13830 1727204110.03718: in run() - task 0affcd87-79f5-1659-6b02-000000000974 13830 1727204110.03731: variable 'ansible_search_path' from source: unknown 13830 1727204110.03735: variable 'ansible_search_path' from source: unknown 13830 1727204110.03775: calling self._execute() 13830 1727204110.03869: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204110.03875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204110.03885: variable 'omit' from source: magic vars 13830 1727204110.04272: variable 'ansible_distribution_major_version' from source: facts 13830 1727204110.04286: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204110.04291: variable 'omit' from source: magic vars 13830 1727204110.04357: variable 'omit' from source: magic vars 13830 1727204110.04509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204110.06078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204110.06125: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204110.06156: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204110.06184: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204110.06204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204110.06290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204110.06309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204110.06333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204110.06971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204110.06975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204110.06977: variable '__network_is_ostree' from source: set_fact 13830 1727204110.06980: variable 'omit' from source: magic vars 13830 1727204110.06982: variable 'omit' from source: magic vars 13830 1727204110.06984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204110.06987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204110.06989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204110.06991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13830 1727204110.06993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204110.06995: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204110.06997: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204110.06999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204110.07001: Set connection var ansible_connection to ssh 13830 1727204110.07003: Set connection var ansible_timeout to 10 13830 1727204110.07005: Set connection var ansible_shell_executable to /bin/sh 13830 1727204110.07007: Set connection var ansible_shell_type to sh 13830 1727204110.07009: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204110.07011: Set connection var ansible_pipelining to False 13830 1727204110.07013: variable 'ansible_shell_executable' from source: unknown 13830 1727204110.07015: variable 'ansible_connection' from source: unknown 13830 1727204110.07017: variable 'ansible_module_compression' from source: unknown 13830 1727204110.07019: variable 'ansible_shell_type' from source: unknown 13830 1727204110.07022: variable 'ansible_shell_executable' from source: unknown 13830 1727204110.07024: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204110.07027: variable 'ansible_pipelining' from source: unknown 13830 1727204110.07030: variable 'ansible_timeout' from source: unknown 13830 1727204110.07033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204110.07040: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204110.07043: variable 'omit' from source: magic vars 13830 1727204110.07045: starting attempt loop 13830 1727204110.07047: running the handler 13830 1727204110.07049: variable 'ansible_facts' from source: unknown 13830 1727204110.07051: variable 'ansible_facts' from source: unknown 13830 1727204110.07090: _low_level_execute_command(): starting 13830 1727204110.07098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204110.07899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204110.07911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.07927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.07947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.07992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204110.07999: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204110.08010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.08024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204110.08034: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204110.08047: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 
1727204110.08055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.08070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.08082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.08090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204110.08097: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204110.08107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.08193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204110.08222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204110.08225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204110.08305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204110.09985: stdout chunk (state=3): >>>/root <<< 13830 1727204110.10097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204110.10200: stderr chunk (state=3): >>><<< 13830 1727204110.10204: stdout chunk (state=3): >>><<< 13830 1727204110.10231: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204110.10246: _low_level_execute_command(): starting 13830 1727204110.10252: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966 `" && echo ansible-tmp-1727204110.1023018-17110-75366951268966="` echo /root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966 `" ) && sleep 0' 13830 1727204110.10958: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204110.10968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.10979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.10992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.11032: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 13830 1727204110.11042: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204110.11051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.11066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204110.11073: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204110.11080: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204110.11087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.11097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.11108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.11113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204110.11120: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204110.11129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.11200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204110.11219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204110.11229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204110.11304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204110.13314: stdout chunk (state=3): >>>ansible-tmp-1727204110.1023018-17110-75366951268966=/root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966 <<< 13830 1727204110.13426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204110.13513: stderr chunk (state=3): >>><<< 13830 1727204110.13516: stdout chunk (state=3): >>><<< 13830 1727204110.13541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204110.1023018-17110-75366951268966=/root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204110.13579: variable 'ansible_module_compression' from source: unknown 13830 1727204110.13645: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 13830 1727204110.13691: variable 'ansible_facts' from source: unknown 13830 1727204110.13797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966/AnsiballZ_dnf.py 13830 1727204110.13953: Sending initial data 13830 1727204110.13956: Sent initial data (151 bytes) 13830 1727204110.14945: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204110.14954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.14967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.14982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.15020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204110.15027: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204110.15040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.15054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204110.15061: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204110.15077: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204110.15085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.15094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.15106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.15115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204110.15118: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204110.15128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.15203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204110.15220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204110.15223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204110.15321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204110.17184: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204110.17219: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204110.17260: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpx4amrz8y 
/root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966/AnsiballZ_dnf.py <<< 13830 1727204110.17297: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204110.19102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204110.19251: stderr chunk (state=3): >>><<< 13830 1727204110.19255: stdout chunk (state=3): >>><<< 13830 1727204110.19281: done transferring module to remote 13830 1727204110.19293: _low_level_execute_command(): starting 13830 1727204110.19296: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966/ /root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966/AnsiballZ_dnf.py && sleep 0' 13830 1727204110.20742: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.20746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.20812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204110.20816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204110.20950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.20953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204110.20969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.21049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204110.21172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204110.21176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204110.21255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204110.23113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204110.23167: stderr chunk (state=3): >>><<< 13830 1727204110.23171: stdout chunk (state=3): >>><<< 13830 1727204110.23186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204110.23190: _low_level_execute_command(): starting 13830 1727204110.23194: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966/AnsiballZ_dnf.py && sleep 0' 13830 1727204110.23717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204110.23734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.23750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.23772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.23817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204110.23834: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204110.23849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.23870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204110.23883: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204110.23900: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204110.23914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204110.23928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204110.23950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204110.23965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204110.23980: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204110.23996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204110.24080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204110.24102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204110.24123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204110.24215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204111.18759: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": 
null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13830 1727204111.23101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204111.23138: stderr chunk (state=3): >>><<< 13830 1727204111.23142: stdout chunk (state=3): >>><<< 13830 1727204111.23278: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
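The module arguments dumped above show the "Install dnsmasq" task resolving to the dnf backend with name=["dnsmasq"] and state=present, and the result that follows records "attempts": 1 plus an "__install_status is success" check, which points at a register/until retry loop. A hedged sketch of the task at create_test_interfaces_with_dhcp.yml:3; the retry and delay settings are not visible in the log and are therefore left out:

- name: Install dnsmasq
  package:
    name: dnsmasq
    state: present
  register: __install_status
  until: __install_status is success    # retries/delay not shown in the trace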
13830 1727204111.23283: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204111.23286: _low_level_execute_command(): starting 13830 1727204111.23289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204110.1023018-17110-75366951268966/ > /dev/null 2>&1 && sleep 0' 13830 1727204111.23969: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204111.23985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204111.24001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.24021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204111.24078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204111.24091: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204111.24107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.24126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204111.24149: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204111.24163: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204111.24180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204111.24195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.24212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204111.24224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204111.24239: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204111.24261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.24343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204111.24381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204111.24401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204111.24534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204111.26471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204111.26475: stdout chunk (state=3): >>><<< 13830 1727204111.26491: stderr chunk (state=3): >>><<< 13830 1727204111.26555: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204111.26558: handler run complete 13830 1727204111.26708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204111.26843: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204111.26873: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204111.26898: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204111.26924: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204111.26981: variable '__install_status' from source: set_fact 13830 1727204111.26995: Evaluated conditional (__install_status is success): True 13830 1727204111.27007: attempt loop complete, returning result 13830 1727204111.27010: _execute() done 13830 1727204111.27012: dumping result to json 13830 1727204111.27017: done dumping result, returning 13830 1727204111.27028: done running TaskExecutor() for managed-node3/TASK: Install dnsmasq [0affcd87-79f5-1659-6b02-000000000974] 13830 1727204111.27038: sending task result for task 0affcd87-79f5-1659-6b02-000000000974 13830 1727204111.27133: done sending task result for task 0affcd87-79f5-1659-6b02-000000000974 13830 1727204111.27136: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13830 1727204111.27217: no more pending results, returning what we have 13830 1727204111.27222: results queue empty 13830 1727204111.27223: checking for any_errors_fatal 13830 1727204111.27224: done checking for any_errors_fatal 13830 1727204111.27225: checking for max_fail_percentage 13830 1727204111.27227: done checking for max_fail_percentage 13830 1727204111.27228: checking to see if all hosts have failed and the running result is not ok 13830 1727204111.27228: done checking to see if all hosts have failed 13830 1727204111.27229: getting the remaining hosts for this loop 13830 1727204111.27233: done getting the remaining hosts for this loop 13830 1727204111.27237: getting the next task for host managed-node3 13830 1727204111.27242: done getting next task for host managed-node3 13830 1727204111.27245: ^ task is: TASK: Install pgrep, sysctl 13830 1727204111.27248: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204111.27251: getting variables 13830 1727204111.27253: in VariableManager get_vars() 13830 1727204111.27294: Calling all_inventory to load vars for managed-node3 13830 1727204111.27297: Calling groups_inventory to load vars for managed-node3 13830 1727204111.27299: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204111.27309: Calling all_plugins_play to load vars for managed-node3 13830 1727204111.27312: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204111.27314: Calling groups_plugins_play to load vars for managed-node3 13830 1727204111.28297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204111.29854: done with get_vars() 13830 1727204111.29894: done getting variables 13830 1727204111.29960: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:55:11 -0400 (0:00:01.269) 0:00:44.378 ***** 13830 1727204111.29996: entering _queue_task() for managed-node3/package 13830 1727204111.30352: worker is 1 (out of 1 available) 13830 1727204111.30367: exiting _queue_task() for managed-node3/package 13830 1727204111.30387: done queuing things up, now waiting for results queue to drain 13830 1727204111.30389: waiting for pending results... 
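Before the next task begins executing, the records above close out TASK: Install dnsmasq: the dnf module returned rc=0, changed=false and "Nothing to do" because the package was already present, the conditional __install_status is success evaluated True, and the remote temporary directory was deleted. A minimal sketch of a task shape that produces this kind of trace follows; it is reconstructed only from the logged module arguments ({'name': 'dnsmasq', 'state': 'present'}) and the attempt/until machinery visible above. Since the log shows __install_status arriving via set_fact rather than register, the real test file is wired differently than this simple register/until form.

    - name: Install dnsmasq
      ansible.builtin.package:        # resolves to ansible.legacy.dnf on this RedHat-family host
        name: dnsmasq
        state: present
      register: __install_status      # assumed wiring; the trace shows the variable coming from set_fact
      until: __install_status is success
      retries: 3                      # illustrative values; the trace only records attempts: 1
      delay: 5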
13830 1727204111.30710: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 13830 1727204111.30817: in run() - task 0affcd87-79f5-1659-6b02-000000000975 13830 1727204111.30837: variable 'ansible_search_path' from source: unknown 13830 1727204111.30842: variable 'ansible_search_path' from source: unknown 13830 1727204111.30884: calling self._execute() 13830 1727204111.30989: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204111.30996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204111.31005: variable 'omit' from source: magic vars 13830 1727204111.31399: variable 'ansible_distribution_major_version' from source: facts 13830 1727204111.31485: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204111.31671: variable 'ansible_os_family' from source: facts 13830 1727204111.31675: Evaluated conditional (ansible_os_family == 'RedHat'): True 13830 1727204111.31745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204111.32051: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204111.32103: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204111.32137: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204111.32178: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204111.32250: variable 'ansible_distribution_major_version' from source: facts 13830 1727204111.32272: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 13830 1727204111.32275: when evaluation is False, skipping this task 13830 1727204111.32278: _execute() done 13830 1727204111.32280: dumping result to json 13830 1727204111.32284: done dumping result, returning 13830 1727204111.32292: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [0affcd87-79f5-1659-6b02-000000000975] 13830 1727204111.32298: sending task result for task 0affcd87-79f5-1659-6b02-000000000975 13830 1727204111.32402: done sending task result for task 0affcd87-79f5-1659-6b02-000000000975 13830 1727204111.32407: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 13830 1727204111.32456: no more pending results, returning what we have 13830 1727204111.32460: results queue empty 13830 1727204111.32461: checking for any_errors_fatal 13830 1727204111.32476: done checking for any_errors_fatal 13830 1727204111.32477: checking for max_fail_percentage 13830 1727204111.32478: done checking for max_fail_percentage 13830 1727204111.32480: checking to see if all hosts have failed and the running result is not ok 13830 1727204111.32481: done checking to see if all hosts have failed 13830 1727204111.32482: getting the remaining hosts for this loop 13830 1727204111.32484: done getting the remaining hosts for this loop 13830 1727204111.32489: getting the next task for host managed-node3 13830 1727204111.32497: done getting next task for host managed-node3 13830 1727204111.32500: ^ task is: TASK: Install pgrep, sysctl 13830 1727204111.32506: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204111.32511: getting variables 13830 1727204111.32512: in VariableManager get_vars() 13830 1727204111.32562: Calling all_inventory to load vars for managed-node3 13830 1727204111.32567: Calling groups_inventory to load vars for managed-node3 13830 1727204111.32569: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204111.32581: Calling all_plugins_play to load vars for managed-node3 13830 1727204111.32584: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204111.32587: Calling groups_plugins_play to load vars for managed-node3 13830 1727204111.34228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204111.35962: done with get_vars() 13830 1727204111.35996: done getting variables 13830 1727204111.36071: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.061) 0:00:44.439 ***** 13830 1727204111.36107: entering _queue_task() for managed-node3/package 13830 1727204111.36482: worker is 1 (out of 1 available) 13830 1727204111.36495: exiting _queue_task() for managed-node3/package 13830 1727204111.36507: done queuing things up, now waiting for results queue to drain 13830 1727204111.36509: waiting for pending results... 
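The skip just above and the task now being queued are the two halves of a version-gated pair in create_test_interfaces_with_dhcp.yml: the task at line 17 is rejected on this host because ansible_distribution_major_version is version('6', '<=') evaluates False, and the task at line 26 is about to run; its version('7', '>=') condition evaluates True in the records that follow. A hedged sketch of that pattern, where only the procps-ng name, the RedHat family check and the two version() tests are taken from the trace and the EL6-era package name is a placeholder:

    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps                  # placeholder; the EL6 branch's package list is not shown in this trace
        state: present
      when:
        - ansible_os_family == 'RedHat'
        - ansible_distribution_major_version is version('6', '<=')

    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps-ng               # confirmed later in the trace by the dnf module arguments
        state: present
      when:
        - ansible_os_family == 'RedHat'
        - ansible_distribution_major_version is version('7', '>=')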
13830 1727204111.36821: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 13830 1727204111.36939: in run() - task 0affcd87-79f5-1659-6b02-000000000976 13830 1727204111.36954: variable 'ansible_search_path' from source: unknown 13830 1727204111.36958: variable 'ansible_search_path' from source: unknown 13830 1727204111.36993: calling self._execute() 13830 1727204111.37089: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204111.37092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204111.37108: variable 'omit' from source: magic vars 13830 1727204111.37499: variable 'ansible_distribution_major_version' from source: facts 13830 1727204111.37512: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204111.37654: variable 'ansible_os_family' from source: facts 13830 1727204111.37659: Evaluated conditional (ansible_os_family == 'RedHat'): True 13830 1727204111.37800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204111.38004: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204111.38039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204111.38067: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204111.38093: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204111.38147: variable 'ansible_distribution_major_version' from source: facts 13830 1727204111.38157: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 13830 1727204111.38163: variable 'omit' from source: magic vars 13830 1727204111.38197: variable 'omit' from source: magic vars 13830 1727204111.38298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204111.40285: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204111.40342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204111.40394: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204111.40423: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204111.40456: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204111.40540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204111.40570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204111.40594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204111.40637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204111.40648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204111.40743: variable '__network_is_ostree' from source: set_fact 13830 1727204111.40747: variable 'omit' from source: magic vars 13830 1727204111.40782: variable 'omit' from source: magic vars 13830 1727204111.40806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204111.40843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204111.40847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204111.40868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204111.40885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204111.40922: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204111.40926: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204111.40929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204111.41001: Set connection var ansible_connection to ssh 13830 1727204111.41009: Set connection var ansible_timeout to 10 13830 1727204111.41014: Set connection var ansible_shell_executable to /bin/sh 13830 1727204111.41016: Set connection var ansible_shell_type to sh 13830 1727204111.41021: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204111.41036: Set connection var ansible_pipelining to False 13830 1727204111.41059: variable 'ansible_shell_executable' from source: unknown 13830 1727204111.41062: variable 'ansible_connection' from source: unknown 13830 1727204111.41065: variable 'ansible_module_compression' from source: unknown 13830 1727204111.41068: variable 'ansible_shell_type' from source: unknown 13830 1727204111.41071: variable 'ansible_shell_executable' from source: unknown 13830 1727204111.41073: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204111.41075: variable 'ansible_pipelining' from source: unknown 13830 1727204111.41077: variable 'ansible_timeout' from source: unknown 13830 1727204111.41082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204111.41150: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204111.41158: variable 'omit' from source: magic vars 13830 1727204111.41163: starting attempt loop 13830 1727204111.41168: running the handler 13830 1727204111.41173: variable 'ansible_facts' from source: unknown 13830 1727204111.41176: variable 'ansible_facts' from source: unknown 13830 1727204111.41205: _low_level_execute_command(): starting 13830 1727204111.41211: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204111.41720: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204111.41753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.41779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.41835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204111.41843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204111.41903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204111.43581: stdout chunk (state=3): >>>/root <<< 13830 1727204111.43739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204111.43771: stderr chunk (state=3): >>><<< 13830 1727204111.43774: stdout chunk (state=3): >>><<< 13830 1727204111.43886: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204111.43890: _low_level_execute_command(): starting 13830 1727204111.43894: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242 `" && echo ansible-tmp-1727204111.4379528-17159-257833452104242="` echo /root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242 `" ) && sleep 0' 13830 1727204111.44538: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204111.44558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.44562: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204111.44601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.44605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.44607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.44660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204111.44681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204111.44741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204111.46644: stdout chunk (state=3): >>>ansible-tmp-1727204111.4379528-17159-257833452104242=/root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242 <<< 13830 1727204111.46771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204111.46880: stderr chunk (state=3): >>><<< 13830 1727204111.46893: stdout chunk (state=3): >>><<< 13830 1727204111.47086: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204111.4379528-17159-257833452104242=/root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204111.47090: variable 'ansible_module_compression' from source: unknown 13830 1727204111.47093: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 13830 1727204111.47095: variable 'ansible_facts' from source: unknown 13830 1727204111.47186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242/AnsiballZ_dnf.py 13830 1727204111.47404: Sending initial data 13830 1727204111.47410: Sent 
initial data (152 bytes) 13830 1727204111.48445: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204111.48470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204111.48474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.48501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204111.48548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204111.48553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204111.48561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204111.48568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.48579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204111.48584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.48646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204111.48659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204111.48706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204111.50396: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13830 1727204111.50407: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204111.50428: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204111.50475: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmptn1ikfj8 /root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242/AnsiballZ_dnf.py <<< 13830 1727204111.50516: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204111.51772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204111.51937: stderr chunk (state=3): >>><<< 13830 1727204111.51940: stdout chunk (state=3): >>><<< 13830 1727204111.51942: done transferring module to remote 13830 1727204111.51944: _low_level_execute_command(): starting 13830 1727204111.51946: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242/ 
/root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242/AnsiballZ_dnf.py && sleep 0' 13830 1727204111.52377: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.52380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204111.52398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204111.52404: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.52414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204111.52420: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.52441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204111.52444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.52505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204111.52507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204111.52543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204111.54224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204111.54283: stderr chunk (state=3): >>><<< 13830 1727204111.54286: stdout chunk (state=3): >>><<< 13830 1727204111.54301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204111.54305: _low_level_execute_command(): starting 13830 1727204111.54309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242/AnsiballZ_dnf.py && sleep 0' 13830 1727204111.54774: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204111.54779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204111.54827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.54831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204111.54833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204111.54885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204111.54903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204111.54906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204111.54959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204112.49711: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13830 1727204112.54283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204112.54346: stderr chunk (state=3): >>><<< 13830 1727204112.54352: stdout chunk (state=3): >>><<< 13830 1727204112.54367: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
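The module's JSON reply above shows the full invocation for the second Install pgrep, sysctl task: ansible.legacy.dnf was asked for name ["procps-ng"], state present, and answered rc=0, changed=false, "Nothing to do", i.e. the package was already installed. The surrounding records trace the usual AnsiballZ round trip: a per-task directory is created under ~/.ansible/tmp, AnsiballZ_dnf.py is pushed over SFTP through the existing ControlMaster connection, chmod u+x is applied, the payload runs under /usr/bin/python3.9, and the records that follow show the directory being removed and the ok result returned to the strategy. As a hedged sketch (the variable name procps_result is hypothetical, not taken from the trace), a play could register and assert on exactly this idempotent outcome:

    - name: Install pgrep, sysctl
      ansible.builtin.package:
        name: procps-ng
        state: present
      register: procps_result         # hypothetical name for illustration

    - name: Confirm the install was a no-op on an already provisioned host
      ansible.builtin.assert:
        that:
          - procps_result is not changed
          - procps_result.rc == 0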
13830 1727204112.54403: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204112.54409: _low_level_execute_command(): starting 13830 1727204112.54415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204111.4379528-17159-257833452104242/ > /dev/null 2>&1 && sleep 0' 13830 1727204112.54897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.54904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.54940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.54953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.55005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204112.55021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204112.55073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204112.56914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204112.56972: stderr chunk (state=3): >>><<< 13830 1727204112.56976: stdout chunk (state=3): >>><<< 13830 1727204112.56990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204112.56997: handler run complete 13830 1727204112.57025: attempt loop complete, returning result 13830 1727204112.57028: _execute() done 13830 1727204112.57030: dumping result to json 13830 1727204112.57037: done dumping result, returning 13830 1727204112.57048: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [0affcd87-79f5-1659-6b02-000000000976] 13830 1727204112.57052: sending task result for task 0affcd87-79f5-1659-6b02-000000000976 13830 1727204112.57159: done sending task result for task 0affcd87-79f5-1659-6b02-000000000976 13830 1727204112.57163: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13830 1727204112.57237: no more pending results, returning what we have 13830 1727204112.57240: results queue empty 13830 1727204112.57241: checking for any_errors_fatal 13830 1727204112.57248: done checking for any_errors_fatal 13830 1727204112.57248: checking for max_fail_percentage 13830 1727204112.57250: done checking for max_fail_percentage 13830 1727204112.57251: checking to see if all hosts have failed and the running result is not ok 13830 1727204112.57251: done checking to see if all hosts have failed 13830 1727204112.57252: getting the remaining hosts for this loop 13830 1727204112.57254: done getting the remaining hosts for this loop 13830 1727204112.57258: getting the next task for host managed-node3 13830 1727204112.57267: done getting next task for host managed-node3 13830 1727204112.57269: ^ task is: TASK: Create test interfaces 13830 1727204112.57274: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204112.57278: getting variables 13830 1727204112.57280: in VariableManager get_vars() 13830 1727204112.57320: Calling all_inventory to load vars for managed-node3 13830 1727204112.57323: Calling groups_inventory to load vars for managed-node3 13830 1727204112.57325: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204112.57338: Calling all_plugins_play to load vars for managed-node3 13830 1727204112.57341: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204112.57344: Calling groups_plugins_play to load vars for managed-node3 13830 1727204112.58313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204112.59230: done with get_vars() 13830 1727204112.59249: done getting variables 13830 1727204112.59298: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:55:12 -0400 (0:00:01.232) 0:00:45.671 ***** 13830 1727204112.59322: entering _queue_task() for managed-node3/shell 13830 1727204112.59575: worker is 1 (out of 1 available) 13830 1727204112.59589: exiting _queue_task() for managed-node3/shell 13830 1727204112.59600: done queuing things up, now waiting for results queue to drain 13830 1727204112.59601: waiting for pending results... 
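The play now moves on to TASK: Create test interfaces, a shell task at create_test_interfaces_with_dhcp.yml:35; the records below show the dhcp_interface1 and dhcp_interface2 play variables being resolved for it. The script body itself is not reproduced in this part of the trace, so the sketch below is only a placeholder for the task's shape, illustrating how two interface-name variables are interpolated into a shell block; the command and the example values are assumptions, not the test's actual content.

    - name: Create test interfaces
      ansible.builtin.shell: |
        ip link add {{ dhcp_interface1 }} type veth peer name {{ dhcp_interface1 }}p
        ip link add {{ dhcp_interface2 }} type veth peer name {{ dhcp_interface2 }}p
      vars:
        dhcp_interface1: test1        # illustrative values; the play defines the real ones
        dhcp_interface2: test2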
13830 1727204112.59790: running TaskExecutor() for managed-node3/TASK: Create test interfaces 13830 1727204112.59891: in run() - task 0affcd87-79f5-1659-6b02-000000000977 13830 1727204112.59902: variable 'ansible_search_path' from source: unknown 13830 1727204112.59905: variable 'ansible_search_path' from source: unknown 13830 1727204112.59939: calling self._execute() 13830 1727204112.60013: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204112.60018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204112.60027: variable 'omit' from source: magic vars 13830 1727204112.60308: variable 'ansible_distribution_major_version' from source: facts 13830 1727204112.60317: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204112.60322: variable 'omit' from source: magic vars 13830 1727204112.60361: variable 'omit' from source: magic vars 13830 1727204112.60610: variable 'dhcp_interface1' from source: play vars 13830 1727204112.60614: variable 'dhcp_interface2' from source: play vars 13830 1727204112.60631: variable 'omit' from source: magic vars 13830 1727204112.60669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204112.60698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204112.60715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204112.60728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204112.60742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204112.60767: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204112.60770: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204112.60773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204112.60849: Set connection var ansible_connection to ssh 13830 1727204112.60855: Set connection var ansible_timeout to 10 13830 1727204112.60860: Set connection var ansible_shell_executable to /bin/sh 13830 1727204112.60863: Set connection var ansible_shell_type to sh 13830 1727204112.60869: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204112.60878: Set connection var ansible_pipelining to False 13830 1727204112.60895: variable 'ansible_shell_executable' from source: unknown 13830 1727204112.60899: variable 'ansible_connection' from source: unknown 13830 1727204112.60902: variable 'ansible_module_compression' from source: unknown 13830 1727204112.60904: variable 'ansible_shell_type' from source: unknown 13830 1727204112.60906: variable 'ansible_shell_executable' from source: unknown 13830 1727204112.60910: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204112.60912: variable 'ansible_pipelining' from source: unknown 13830 1727204112.60914: variable 'ansible_timeout' from source: unknown 13830 1727204112.60917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204112.61029: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204112.61043: variable 'omit' from source: magic vars 13830 1727204112.61048: starting attempt loop 13830 1727204112.61051: running the handler 13830 1727204112.61061: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204112.61078: _low_level_execute_command(): starting 13830 1727204112.61084: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204112.61894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204112.61911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.61931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.61955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.62005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.62022: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204112.62041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.62061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204112.62080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204112.62090: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204112.62100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.62114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.62130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.62151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.62166: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204112.62188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.62267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204112.62271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204112.62320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204112.64017: stdout chunk (state=3): >>>/root <<< 13830 1727204112.64107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204112.64193: stderr chunk (state=3): >>><<< 13830 1727204112.64207: stdout chunk (state=3): >>><<< 13830 1727204112.64333: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204112.64344: _low_level_execute_command(): starting 13830 1727204112.64347: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611 `" && echo ansible-tmp-1727204112.642422-17204-69873533515611="` echo /root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611 `" ) && sleep 0' 13830 1727204112.64950: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204112.64967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.64984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.65001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.65042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.65053: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204112.65068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.65084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204112.65098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204112.65109: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204112.65122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.65140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.65158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.65173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.65185: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204112.65202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.65284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204112.65312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204112.65331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204112.65411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204112.67250: stdout chunk 
(state=3): >>>ansible-tmp-1727204112.642422-17204-69873533515611=/root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611 <<< 13830 1727204112.67384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204112.67434: stderr chunk (state=3): >>><<< 13830 1727204112.67438: stdout chunk (state=3): >>><<< 13830 1727204112.67553: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204112.642422-17204-69873533515611=/root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204112.67556: variable 'ansible_module_compression' from source: unknown 13830 1727204112.67558: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204112.67560: variable 'ansible_facts' from source: unknown 13830 1727204112.67618: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611/AnsiballZ_command.py 13830 1727204112.67729: Sending initial data 13830 1727204112.67733: Sent initial data (154 bytes) 13830 1727204112.68471: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204112.68481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.68492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.68513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.68631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.68634: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204112.68637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.68639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204112.68641: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204112.68642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204112.68644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.68647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 
1727204112.68649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.68651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.68652: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204112.68654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.68709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204112.68719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204112.68732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204112.68801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204112.70506: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204112.70556: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204112.70609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp20adanoo /root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611/AnsiballZ_command.py <<< 13830 1727204112.70658: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204112.71594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204112.71693: stderr chunk (state=3): >>><<< 13830 1727204112.71706: stdout chunk (state=3): >>><<< 13830 1727204112.71729: done transferring module to remote 13830 1727204112.71741: _low_level_execute_command(): starting 13830 1727204112.71749: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611/ /root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611/AnsiballZ_command.py && sleep 0' 13830 1727204112.72408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204112.72418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.72428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.72443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.72487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.72493: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204112.72503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.72517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204112.72524: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204112.72530: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204112.72538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.72547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.72559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.72567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.72576: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204112.72591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.72662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204112.72682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204112.72701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204112.72771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204112.74450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204112.74512: stderr chunk (state=3): >>><<< 13830 1727204112.74516: stdout chunk (state=3): >>><<< 13830 1727204112.74533: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204112.74537: _low_level_execute_command(): starting 13830 1727204112.74543: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611/AnsiballZ_command.py && sleep 0' 13830 1727204112.75207: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204112.75212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.75215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.75217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.75220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.75222: stderr chunk (state=3): >>>debug2: 
match not found <<< 13830 1727204112.75437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.75444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204112.75447: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204112.75495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204112.75498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204112.75500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204112.75502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204112.75505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204112.75507: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204112.75508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204112.75510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204112.75512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204112.75514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204112.75527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.11173: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 616 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 616 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop 
following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:55:12.884111", "end": "2024-09-24 14:55:14.110028", "delta": "0:00:01.225917", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204114.12561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204114.12567: stdout chunk (state=3): >>><<< 13830 1727204114.12569: stderr chunk (state=3): >>><<< 13830 1727204114.12745: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 616 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 616 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:55:12.884111", "end": "2024-09-24 14:55:14.110028", "delta": "0:00:01.225917", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
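
The JSON result above is the raw return of the 'Create test interfaces' shell task: rc=0, "stdout" is empty, and the full command trace sits in "stderr" because the script begins with set -euxo pipefail and exec 1>&2, which sends the xtrace output to stderr. The trace confirms the veth pairs test1/test1p and test2/test2p, the testbr bridge carrying 192.0.2.1/24 and 2001:DB8::1/32, and a dnsmasq instance serving both DHCP ranges. What follows is a minimal verification sketch, not something Ansible ran in this log: the commands are standard iproute2/procps spot checks one could run by hand on the managed node, and they are an assumption about how a reader might confirm the state the script is expected to leave behind.

# inspect the bridge and its addresses (expect 192.0.2.1/24 and 2001:db8::1/32)
ip -br addr show testbr

# confirm the veth peers are up and enslaved to the bridge
ip -br link show master testbr
bridge link show | grep testbr

# confirm dnsmasq started with the PID file the script requested
cat /run/dhcp_testbr.pid
pgrep -a dnsmasq | grep dhcp_testbr

The script's retry loop around the address assignment exists because of the NetworkManager bug referenced in its comment (bugzilla 2079642); rc=0 here means the loop saw an inet address on testbr within its 30 one-second attempts.
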
13830 1727204114.12753: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204114.12756: _low_level_execute_command(): starting 13830 1727204114.12758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204112.642422-17204-69873533515611/ > /dev/null 2>&1 && sleep 0' 13830 1727204114.14529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.14536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.14567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204114.14571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.14574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.14669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204114.14692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.14701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.16529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.16536: stdout chunk (state=3): >>><<< 13830 1727204114.16538: stderr chunk (state=3): >>><<< 13830 1727204114.16555: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204114.16566: handler run complete 13830 1727204114.16587: Evaluated conditional (False): False 13830 1727204114.16596: attempt loop complete, returning result 13830 1727204114.16599: _execute() done 13830 1727204114.16601: dumping result to json 13830 1727204114.16608: done dumping result, returning 13830 1727204114.16616: done running TaskExecutor() for managed-node3/TASK: Create test interfaces [0affcd87-79f5-1659-6b02-000000000977] 13830 1727204114.16622: sending task result for task 0affcd87-79f5-1659-6b02-000000000977 13830 1727204114.16736: done sending task result for task 0affcd87-79f5-1659-6b02-000000000977 13830 1727204114.16739: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.225917", "end": "2024-09-24 14:55:14.110028", "rc": 0, "start": "2024-09-24 14:55:12.884111" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 616 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 616 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 13830 1727204114.16817: no more pending results, returning what we have 13830 1727204114.16821: results queue empty 13830 1727204114.16822: checking for any_errors_fatal 13830 1727204114.16836: done checking for any_errors_fatal 13830 1727204114.16837: checking for max_fail_percentage 13830 1727204114.16838: done checking for max_fail_percentage 13830 1727204114.16839: checking to see if all hosts have failed and the running result is not ok 13830 1727204114.16840: done checking to see if all hosts have failed 13830 1727204114.16840: getting the remaining hosts for this loop 13830 1727204114.16842: done getting the remaining hosts for this loop 13830 1727204114.16847: getting the next task for host managed-node3 13830 1727204114.16857: done getting next task for host managed-node3 13830 1727204114.16859: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 13830 1727204114.16871: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204114.16875: getting variables 13830 1727204114.16877: in VariableManager get_vars() 13830 1727204114.16910: Calling all_inventory to load vars for managed-node3 13830 1727204114.16913: Calling groups_inventory to load vars for managed-node3 13830 1727204114.16915: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204114.16924: Calling all_plugins_play to load vars for managed-node3 13830 1727204114.16927: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204114.16929: Calling groups_plugins_play to load vars for managed-node3 13830 1727204114.19539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204114.21791: done with get_vars() 13830 1727204114.21821: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:55:14 -0400 (0:00:01.626) 0:00:47.297 ***** 13830 1727204114.21949: entering _queue_task() for managed-node3/include_tasks 13830 1727204114.22326: worker is 1 (out of 1 available) 13830 1727204114.22341: exiting _queue_task() for managed-node3/include_tasks 13830 1727204114.22353: done queuing things up, now waiting for results queue to drain 13830 1727204114.22355: waiting for pending results... 
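
With the test interfaces in place, the play moves on to assert_device_present.yml, which includes get_interface_stat.yml; the next task in the log, 'Get stat for interface test1', resolves {{ interface }} from dhcp_interface1 and runs the stat module over the same SSH/AnsiballZ path traced above. As a rough sketch for readers following along, the same presence check can be reproduced ad hoc; note that the contents of get_interface_stat.yml are not shown in this log, so the /sys/class/net path below is an assumption, not necessarily the path the included task stats.

# hypothetical ad-hoc equivalent of the included stat check (path is assumed);
# assumes an inventory that defines managed-node3 the same way this run does
ansible managed-node3 -m ansible.builtin.stat -a "path=/sys/class/net/test1"

# or directly on the managed node: network devices created by the previous
# task appear under /sys/class/net
test -d /sys/class/net/test1 && echo "test1 present"
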
13830 1727204114.22677: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 13830 1727204114.22837: in run() - task 0affcd87-79f5-1659-6b02-00000000097e 13830 1727204114.22858: variable 'ansible_search_path' from source: unknown 13830 1727204114.22869: variable 'ansible_search_path' from source: unknown 13830 1727204114.22911: calling self._execute() 13830 1727204114.23044: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.23055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.23072: variable 'omit' from source: magic vars 13830 1727204114.23489: variable 'ansible_distribution_major_version' from source: facts 13830 1727204114.23507: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204114.23517: _execute() done 13830 1727204114.23524: dumping result to json 13830 1727204114.23534: done dumping result, returning 13830 1727204114.23545: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1659-6b02-00000000097e] 13830 1727204114.23560: sending task result for task 0affcd87-79f5-1659-6b02-00000000097e 13830 1727204114.23698: no more pending results, returning what we have 13830 1727204114.23704: in VariableManager get_vars() 13830 1727204114.23758: Calling all_inventory to load vars for managed-node3 13830 1727204114.23762: Calling groups_inventory to load vars for managed-node3 13830 1727204114.23766: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204114.23781: Calling all_plugins_play to load vars for managed-node3 13830 1727204114.23784: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204114.23787: Calling groups_plugins_play to load vars for managed-node3 13830 1727204114.25777: done sending task result for task 0affcd87-79f5-1659-6b02-00000000097e 13830 1727204114.25782: WORKER PROCESS EXITING 13830 1727204114.33983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204114.36584: done with get_vars() 13830 1727204114.36614: variable 'ansible_search_path' from source: unknown 13830 1727204114.36615: variable 'ansible_search_path' from source: unknown 13830 1727204114.36663: we have included files to process 13830 1727204114.36666: generating all_blocks data 13830 1727204114.36668: done generating all_blocks data 13830 1727204114.36672: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204114.36673: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204114.36675: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204114.36859: done processing included file 13830 1727204114.36861: iterating over new_blocks loaded from include file 13830 1727204114.36865: in VariableManager get_vars() 13830 1727204114.36890: done with get_vars() 13830 1727204114.36892: filtering new block on tags 13830 1727204114.36921: done filtering new block on tags 13830 1727204114.36923: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 13830 
1727204114.36927: extending task lists for all hosts with included blocks 13830 1727204114.37150: done extending task lists 13830 1727204114.37151: done processing included files 13830 1727204114.37152: results queue empty 13830 1727204114.37153: checking for any_errors_fatal 13830 1727204114.37158: done checking for any_errors_fatal 13830 1727204114.37159: checking for max_fail_percentage 13830 1727204114.37160: done checking for max_fail_percentage 13830 1727204114.37161: checking to see if all hosts have failed and the running result is not ok 13830 1727204114.37162: done checking to see if all hosts have failed 13830 1727204114.37162: getting the remaining hosts for this loop 13830 1727204114.37166: done getting the remaining hosts for this loop 13830 1727204114.37168: getting the next task for host managed-node3 13830 1727204114.37173: done getting next task for host managed-node3 13830 1727204114.37179: ^ task is: TASK: Get stat for interface {{ interface }} 13830 1727204114.37183: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204114.37185: getting variables 13830 1727204114.37186: in VariableManager get_vars() 13830 1727204114.37197: Calling all_inventory to load vars for managed-node3 13830 1727204114.37203: Calling groups_inventory to load vars for managed-node3 13830 1727204114.37205: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204114.37212: Calling all_plugins_play to load vars for managed-node3 13830 1727204114.37214: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204114.37217: Calling groups_plugins_play to load vars for managed-node3 13830 1727204114.38879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204114.41079: done with get_vars() 13830 1727204114.41095: done getting variables 13830 1727204114.41205: variable 'interface' from source: task vars 13830 1727204114.41208: variable 'dhcp_interface1' from source: play vars 13830 1727204114.41250: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.193) 0:00:47.490 ***** 13830 1727204114.41277: entering _queue_task() for managed-node3/stat 13830 1727204114.41523: worker is 1 (out of 1 available) 13830 1727204114.41538: exiting _queue_task() for managed-node3/stat 13830 1727204114.41550: done queuing things up, now waiting for results queue to drain 13830 1727204114.41553: waiting for pending results... 13830 1727204114.41737: running TaskExecutor() for managed-node3/TASK: Get stat for interface test1 13830 1727204114.41850: in run() - task 0affcd87-79f5-1659-6b02-0000000009dd 13830 1727204114.41861: variable 'ansible_search_path' from source: unknown 13830 1727204114.41866: variable 'ansible_search_path' from source: unknown 13830 1727204114.41896: calling self._execute() 13830 1727204114.41974: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.41979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.41988: variable 'omit' from source: magic vars 13830 1727204114.42528: variable 'ansible_distribution_major_version' from source: facts 13830 1727204114.42550: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204114.42565: variable 'omit' from source: magic vars 13830 1727204114.42652: variable 'omit' from source: magic vars 13830 1727204114.42766: variable 'interface' from source: task vars 13830 1727204114.42778: variable 'dhcp_interface1' from source: play vars 13830 1727204114.42858: variable 'dhcp_interface1' from source: play vars 13830 1727204114.42883: variable 'omit' from source: magic vars 13830 1727204114.42945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204114.42990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204114.43030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204114.43072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204114.43088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 13830 1727204114.43143: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204114.43153: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.43160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.43274: Set connection var ansible_connection to ssh 13830 1727204114.43289: Set connection var ansible_timeout to 10 13830 1727204114.43298: Set connection var ansible_shell_executable to /bin/sh 13830 1727204114.43304: Set connection var ansible_shell_type to sh 13830 1727204114.43313: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204114.43334: Set connection var ansible_pipelining to False 13830 1727204114.43369: variable 'ansible_shell_executable' from source: unknown 13830 1727204114.43380: variable 'ansible_connection' from source: unknown 13830 1727204114.43387: variable 'ansible_module_compression' from source: unknown 13830 1727204114.43393: variable 'ansible_shell_type' from source: unknown 13830 1727204114.43398: variable 'ansible_shell_executable' from source: unknown 13830 1727204114.43404: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.43410: variable 'ansible_pipelining' from source: unknown 13830 1727204114.43416: variable 'ansible_timeout' from source: unknown 13830 1727204114.43423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.43714: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204114.43737: variable 'omit' from source: magic vars 13830 1727204114.43751: starting attempt loop 13830 1727204114.43768: running the handler 13830 1727204114.43802: _low_level_execute_command(): starting 13830 1727204114.43817: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204114.45641: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.45648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.45756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204114.45762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.45771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204114.45774: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204114.45777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.45808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.45912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204114.45915: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204114.45920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.45986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.47705: stdout chunk (state=3): >>>/root <<< 13830 1727204114.47814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.47869: stderr chunk (state=3): >>><<< 13830 1727204114.47879: stdout chunk (state=3): >>><<< 13830 1727204114.47913: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204114.47917: _low_level_execute_command(): starting 13830 1727204114.47925: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510 `" && echo ansible-tmp-1727204114.4790554-17333-253633215055510="` echo /root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510 `" ) && sleep 0' 13830 1727204114.48555: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.48570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.48615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204114.48621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204114.48634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.48644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204114.48649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.48731: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 13830 1727204114.48745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204114.48750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.48832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.50814: stdout chunk (state=3): >>>ansible-tmp-1727204114.4790554-17333-253633215055510=/root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510 <<< 13830 1727204114.50933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.51016: stderr chunk (state=3): >>><<< 13830 1727204114.51019: stdout chunk (state=3): >>><<< 13830 1727204114.51035: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204114.4790554-17333-253633215055510=/root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204114.51081: variable 'ansible_module_compression' from source: unknown 13830 1727204114.51137: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13830 1727204114.51195: variable 'ansible_facts' from source: unknown 13830 1727204114.51287: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510/AnsiballZ_stat.py 13830 1727204114.51445: Sending initial data 13830 1727204114.51449: Sent initial data (153 bytes) 13830 1727204114.52200: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.52204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.52242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.52245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.52249: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.52299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204114.52305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.52365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.54117: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204114.54175: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 13830 1727204114.54187: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 13830 1727204114.54196: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 13830 1727204114.54291: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpxk_sil0h /root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510/AnsiballZ_stat.py <<< 13830 1727204114.54321: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204114.55217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.55382: stderr chunk (state=3): >>><<< 13830 1727204114.55386: stdout chunk (state=3): >>><<< 13830 1727204114.55409: done transferring module to remote 13830 1727204114.55416: _low_level_execute_command(): starting 13830 1727204114.55421: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510/ /root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510/AnsiballZ_stat.py && sleep 0' 13830 1727204114.56074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204114.56082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.56092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.56108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.56148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204114.56154: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204114.56166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.56190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204114.56193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204114.56202: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 
1727204114.56220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.56223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.56241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.56251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204114.56254: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204114.56262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.56338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204114.56362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204114.56378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.56440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.58144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.58219: stderr chunk (state=3): >>><<< 13830 1727204114.58222: stdout chunk (state=3): >>><<< 13830 1727204114.58231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204114.58237: _low_level_execute_command(): starting 13830 1727204114.58250: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510/AnsiballZ_stat.py && sleep 0' 13830 1727204114.58740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.58745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.58784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.58787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.58789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.58845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204114.58848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204114.58851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.58904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.72066: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28211, "dev": 21, "nlink": 1, "atime": 1727204112.8937595, "mtime": 1727204112.8937595, "ctime": 1727204112.8937595, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13830 1727204114.73220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204114.73279: stderr chunk (state=3): >>><<< 13830 1727204114.73282: stdout chunk (state=3): >>><<< 13830 1727204114.73299: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28211, "dev": 21, "nlink": 1, "atime": 1727204112.8937595, "mtime": 1727204112.8937595, "ctime": 1727204112.8937595, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
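[Editor's note] The module run recorded above stats /sys/class/net/test1 with get_attributes, get_checksum and get_mime all disabled (see the logged module_args). The task in get_interface_stat.yml that produces this invocation plausibly looks like the sketch below; this is a reconstruction from the logged module_args and the later 'interface_stat' variable lookup, not text copied from the collection, and the register name is an assumption.

    # tasks/get_interface_stat.yml -- hedged reconstruction from the logged module_args
    - name: Get stat for interface {{ interface }}
      stat:
        get_attributes: false      # matches the logged invocation
        get_checksum: false
        get_mime: false
        path: /sys/class/net/{{ interface }}
      register: interface_stat     # assumed name; the assert later reads interface_stat.stat.exists

Registering the result lets the follow-up assert check only stat.exists, which is true here because /sys/class/net/test1 is a symlink into /sys/devices/virtual/net/test1.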
13830 1727204114.73344: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204114.73353: _low_level_execute_command(): starting 13830 1727204114.73356: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204114.4790554-17333-253633215055510/ > /dev/null 2>&1 && sleep 0' 13830 1727204114.73834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.73842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.73899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204114.73902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13830 1727204114.73905: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.73907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204114.73909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.73948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204114.73961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.74022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.75890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.75947: stderr chunk (state=3): >>><<< 13830 1727204114.75951: stdout chunk (state=3): >>><<< 13830 1727204114.75965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204114.75972: handler run complete 13830 1727204114.76007: attempt loop complete, returning result 13830 1727204114.76011: _execute() done 13830 1727204114.76014: dumping result to json 13830 1727204114.76020: done dumping result, returning 13830 1727204114.76029: done running TaskExecutor() for managed-node3/TASK: Get stat for interface test1 [0affcd87-79f5-1659-6b02-0000000009dd] 13830 1727204114.76036: sending task result for task 0affcd87-79f5-1659-6b02-0000000009dd 13830 1727204114.76146: done sending task result for task 0affcd87-79f5-1659-6b02-0000000009dd 13830 1727204114.76148: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204112.8937595, "block_size": 4096, "blocks": 0, "ctime": 1727204112.8937595, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28211, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204112.8937595, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13830 1727204114.76236: no more pending results, returning what we have 13830 1727204114.76240: results queue empty 13830 1727204114.76241: checking for any_errors_fatal 13830 1727204114.76243: done checking for any_errors_fatal 13830 1727204114.76244: checking for max_fail_percentage 13830 1727204114.76245: done checking for max_fail_percentage 13830 1727204114.76246: checking to see if all hosts have failed and the running result is not ok 13830 1727204114.76247: done checking to see if all hosts have failed 13830 1727204114.76248: getting the remaining hosts for this loop 13830 1727204114.76250: done getting the remaining hosts for this loop 13830 1727204114.76254: getting the next task for host managed-node3 13830 1727204114.76264: done getting next task for host managed-node3 13830 1727204114.76267: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13830 1727204114.76275: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204114.76278: getting variables 13830 1727204114.76280: in VariableManager get_vars() 13830 1727204114.76314: Calling all_inventory to load vars for managed-node3 13830 1727204114.76317: Calling groups_inventory to load vars for managed-node3 13830 1727204114.76319: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204114.76329: Calling all_plugins_play to load vars for managed-node3 13830 1727204114.76334: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204114.76337: Calling groups_plugins_play to load vars for managed-node3 13830 1727204114.77160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204114.78109: done with get_vars() 13830 1727204114.78127: done getting variables 13830 1727204114.78177: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204114.78277: variable 'interface' from source: task vars 13830 1727204114.78280: variable 'dhcp_interface1' from source: play vars 13830 1727204114.78321: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.370) 0:00:47.861 ***** 13830 1727204114.78351: entering _queue_task() for managed-node3/assert 13830 1727204114.78588: worker is 1 (out of 1 available) 13830 1727204114.78601: exiting _queue_task() for managed-node3/assert 13830 1727204114.78613: done queuing things up, now waiting for results queue to drain 13830 1727204114.78615: waiting for pending results... 
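[Editor's note] The assert queued here runs from assert_device_present.yml:5. Based on the task name in the banner and the conditional evaluated a few lines further down (interface_stat.stat.exists), the task is likely close to the following sketch; only the name and the condition come from the log, the exact file contents are an assumption.

    # tasks/assert_device_present.yml, around line 5 -- hedged sketch
    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists   # logged as "Evaluated conditional (interface_stat.stat.exists): True"

Because the stat of /sys/class/net/test1 returned exists: true, the assertion passes and the task reports "All assertions passed" with changed: false.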
13830 1727204114.78796: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' 13830 1727204114.78893: in run() - task 0affcd87-79f5-1659-6b02-00000000097f 13830 1727204114.78904: variable 'ansible_search_path' from source: unknown 13830 1727204114.78907: variable 'ansible_search_path' from source: unknown 13830 1727204114.78938: calling self._execute() 13830 1727204114.79010: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.79014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.79023: variable 'omit' from source: magic vars 13830 1727204114.79288: variable 'ansible_distribution_major_version' from source: facts 13830 1727204114.79298: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204114.79304: variable 'omit' from source: magic vars 13830 1727204114.79351: variable 'omit' from source: magic vars 13830 1727204114.79421: variable 'interface' from source: task vars 13830 1727204114.79425: variable 'dhcp_interface1' from source: play vars 13830 1727204114.79473: variable 'dhcp_interface1' from source: play vars 13830 1727204114.79489: variable 'omit' from source: magic vars 13830 1727204114.79524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204114.79554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204114.79572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204114.79585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204114.79594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204114.79618: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204114.79621: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.79624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.79694: Set connection var ansible_connection to ssh 13830 1727204114.79702: Set connection var ansible_timeout to 10 13830 1727204114.79708: Set connection var ansible_shell_executable to /bin/sh 13830 1727204114.79711: Set connection var ansible_shell_type to sh 13830 1727204114.79716: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204114.79725: Set connection var ansible_pipelining to False 13830 1727204114.79742: variable 'ansible_shell_executable' from source: unknown 13830 1727204114.79745: variable 'ansible_connection' from source: unknown 13830 1727204114.79747: variable 'ansible_module_compression' from source: unknown 13830 1727204114.79750: variable 'ansible_shell_type' from source: unknown 13830 1727204114.79753: variable 'ansible_shell_executable' from source: unknown 13830 1727204114.79756: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.79758: variable 'ansible_pipelining' from source: unknown 13830 1727204114.79761: variable 'ansible_timeout' from source: unknown 13830 1727204114.79765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.79863: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204114.79876: variable 'omit' from source: magic vars 13830 1727204114.79879: starting attempt loop 13830 1727204114.79882: running the handler 13830 1727204114.79974: variable 'interface_stat' from source: set_fact 13830 1727204114.79999: Evaluated conditional (interface_stat.stat.exists): True 13830 1727204114.80002: handler run complete 13830 1727204114.80011: attempt loop complete, returning result 13830 1727204114.80013: _execute() done 13830 1727204114.80016: dumping result to json 13830 1727204114.80019: done dumping result, returning 13830 1727204114.80024: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' [0affcd87-79f5-1659-6b02-00000000097f] 13830 1727204114.80029: sending task result for task 0affcd87-79f5-1659-6b02-00000000097f 13830 1727204114.80118: done sending task result for task 0affcd87-79f5-1659-6b02-00000000097f 13830 1727204114.80121: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204114.80199: no more pending results, returning what we have 13830 1727204114.80203: results queue empty 13830 1727204114.80204: checking for any_errors_fatal 13830 1727204114.80217: done checking for any_errors_fatal 13830 1727204114.80218: checking for max_fail_percentage 13830 1727204114.80219: done checking for max_fail_percentage 13830 1727204114.80220: checking to see if all hosts have failed and the running result is not ok 13830 1727204114.80221: done checking to see if all hosts have failed 13830 1727204114.80221: getting the remaining hosts for this loop 13830 1727204114.80223: done getting the remaining hosts for this loop 13830 1727204114.80226: getting the next task for host managed-node3 13830 1727204114.80236: done getting next task for host managed-node3 13830 1727204114.80239: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13830 1727204114.80244: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204114.80247: getting variables 13830 1727204114.80248: in VariableManager get_vars() 13830 1727204114.80281: Calling all_inventory to load vars for managed-node3 13830 1727204114.80284: Calling groups_inventory to load vars for managed-node3 13830 1727204114.80286: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204114.80295: Calling all_plugins_play to load vars for managed-node3 13830 1727204114.80297: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204114.80300: Calling groups_plugins_play to load vars for managed-node3 13830 1727204114.81226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204114.82146: done with get_vars() 13830 1727204114.82167: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.038) 0:00:47.900 ***** 13830 1727204114.82244: entering _queue_task() for managed-node3/include_tasks 13830 1727204114.82486: worker is 1 (out of 1 available) 13830 1727204114.82500: exiting _queue_task() for managed-node3/include_tasks 13830 1727204114.82513: done queuing things up, now waiting for results queue to drain 13830 1727204114.82515: waiting for pending results... 13830 1727204114.82697: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 13830 1727204114.82795: in run() - task 0affcd87-79f5-1659-6b02-000000000983 13830 1727204114.82805: variable 'ansible_search_path' from source: unknown 13830 1727204114.82811: variable 'ansible_search_path' from source: unknown 13830 1727204114.82843: calling self._execute() 13830 1727204114.82921: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.82925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.82940: variable 'omit' from source: magic vars 13830 1727204114.83207: variable 'ansible_distribution_major_version' from source: facts 13830 1727204114.83219: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204114.83224: _execute() done 13830 1727204114.83227: dumping result to json 13830 1727204114.83233: done dumping result, returning 13830 1727204114.83237: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-1659-6b02-000000000983] 13830 1727204114.83243: sending task result for task 0affcd87-79f5-1659-6b02-000000000983 13830 1727204114.83341: done sending task result for task 0affcd87-79f5-1659-6b02-000000000983 13830 1727204114.83344: WORKER PROCESS EXITING 13830 1727204114.83377: no more pending results, returning what we have 13830 1727204114.83383: in VariableManager get_vars() 13830 1727204114.83427: Calling all_inventory to load vars for managed-node3 13830 1727204114.83429: Calling groups_inventory to load vars for managed-node3 13830 1727204114.83434: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204114.83449: Calling all_plugins_play to load vars for managed-node3 13830 1727204114.83452: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204114.83454: Calling groups_plugins_play to load vars for managed-node3 13830 1727204114.84262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
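[Editor's note] The variable trace here ('interface' from task vars, 'dhcp_interface1' and 'dhcp_interface2' from play vars) suggests the calling play includes assert_device_present.yml once per DHCP test interface, passing interface as a task var. A hedged sketch of that caller, with the variable names taken from the log but the looping structure assumed (the play could equally include the file twice without a loop):

    # caller side -- assumed structure; dhcp_interface1/dhcp_interface2 are play vars in this log
    - name: Assert both DHCP test devices are present
      include_tasks: tasks/assert_device_present.yml
      vars:
        interface: "{{ item }}"
      loop:
        - "{{ dhcp_interface1 }}"
        - "{{ dhcp_interface2 }}"

Whichever form the real test uses, the effect visible in this excerpt is the same: the include is processed once for test1 above and once for test2 below.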
due to reserved name 13830 1727204114.85203: done with get_vars() 13830 1727204114.85219: variable 'ansible_search_path' from source: unknown 13830 1727204114.85221: variable 'ansible_search_path' from source: unknown 13830 1727204114.85248: we have included files to process 13830 1727204114.85249: generating all_blocks data 13830 1727204114.85250: done generating all_blocks data 13830 1727204114.85253: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204114.85254: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204114.85255: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13830 1727204114.85390: done processing included file 13830 1727204114.85392: iterating over new_blocks loaded from include file 13830 1727204114.85393: in VariableManager get_vars() 13830 1727204114.85406: done with get_vars() 13830 1727204114.85407: filtering new block on tags 13830 1727204114.85427: done filtering new block on tags 13830 1727204114.85429: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 13830 1727204114.85435: extending task lists for all hosts with included blocks 13830 1727204114.85562: done extending task lists 13830 1727204114.85565: done processing included files 13830 1727204114.85566: results queue empty 13830 1727204114.85566: checking for any_errors_fatal 13830 1727204114.85569: done checking for any_errors_fatal 13830 1727204114.85569: checking for max_fail_percentage 13830 1727204114.85570: done checking for max_fail_percentage 13830 1727204114.85571: checking to see if all hosts have failed and the running result is not ok 13830 1727204114.85571: done checking to see if all hosts have failed 13830 1727204114.85571: getting the remaining hosts for this loop 13830 1727204114.85572: done getting the remaining hosts for this loop 13830 1727204114.85574: getting the next task for host managed-node3 13830 1727204114.85577: done getting next task for host managed-node3 13830 1727204114.85579: ^ task is: TASK: Get stat for interface {{ interface }} 13830 1727204114.85581: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204114.85583: getting variables 13830 1727204114.85584: in VariableManager get_vars() 13830 1727204114.85592: Calling all_inventory to load vars for managed-node3 13830 1727204114.85593: Calling groups_inventory to load vars for managed-node3 13830 1727204114.85594: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204114.85598: Calling all_plugins_play to load vars for managed-node3 13830 1727204114.85599: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204114.85601: Calling groups_plugins_play to load vars for managed-node3 13830 1727204114.86337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204114.87250: done with get_vars() 13830 1727204114.87269: done getting variables 13830 1727204114.87387: variable 'interface' from source: task vars 13830 1727204114.87390: variable 'dhcp_interface2' from source: play vars 13830 1727204114.87430: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.052) 0:00:47.952 ***** 13830 1727204114.87457: entering _queue_task() for managed-node3/stat 13830 1727204114.87710: worker is 1 (out of 1 available) 13830 1727204114.87725: exiting _queue_task() for managed-node3/stat 13830 1727204114.87738: done queuing things up, now waiting for results queue to drain 13830 1727204114.87740: waiting for pending results... 
13830 1727204114.87921: running TaskExecutor() for managed-node3/TASK: Get stat for interface test2 13830 1727204114.88025: in run() - task 0affcd87-79f5-1659-6b02-000000000a01 13830 1727204114.88043: variable 'ansible_search_path' from source: unknown 13830 1727204114.88047: variable 'ansible_search_path' from source: unknown 13830 1727204114.88073: calling self._execute() 13830 1727204114.88144: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.88148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.88156: variable 'omit' from source: magic vars 13830 1727204114.88419: variable 'ansible_distribution_major_version' from source: facts 13830 1727204114.88429: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204114.88436: variable 'omit' from source: magic vars 13830 1727204114.88489: variable 'omit' from source: magic vars 13830 1727204114.88559: variable 'interface' from source: task vars 13830 1727204114.88566: variable 'dhcp_interface2' from source: play vars 13830 1727204114.88615: variable 'dhcp_interface2' from source: play vars 13830 1727204114.88629: variable 'omit' from source: magic vars 13830 1727204114.88667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204114.88696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204114.88713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204114.88726: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204114.88741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204114.88765: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204114.88768: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.88771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.88843: Set connection var ansible_connection to ssh 13830 1727204114.88852: Set connection var ansible_timeout to 10 13830 1727204114.88857: Set connection var ansible_shell_executable to /bin/sh 13830 1727204114.88860: Set connection var ansible_shell_type to sh 13830 1727204114.88866: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204114.88875: Set connection var ansible_pipelining to False 13830 1727204114.88891: variable 'ansible_shell_executable' from source: unknown 13830 1727204114.88893: variable 'ansible_connection' from source: unknown 13830 1727204114.88896: variable 'ansible_module_compression' from source: unknown 13830 1727204114.88898: variable 'ansible_shell_type' from source: unknown 13830 1727204114.88900: variable 'ansible_shell_executable' from source: unknown 13830 1727204114.88903: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204114.88906: variable 'ansible_pipelining' from source: unknown 13830 1727204114.88909: variable 'ansible_timeout' from source: unknown 13830 1727204114.88915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204114.89069: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204114.89077: variable 'omit' from source: magic vars 13830 1727204114.89082: starting attempt loop 13830 1727204114.89085: running the handler 13830 1727204114.89097: _low_level_execute_command(): starting 13830 1727204114.89103: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204114.89634: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.89646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.89676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.89693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.89704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.89755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204114.89762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204114.89776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.89856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.91540: stdout chunk (state=3): >>>/root <<< 13830 1727204114.91635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.91692: stderr chunk (state=3): >>><<< 13830 1727204114.91696: stdout chunk (state=3): >>><<< 13830 1727204114.91720: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204114.91734: 
_low_level_execute_command(): starting 13830 1727204114.91739: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026 `" && echo ansible-tmp-1727204114.9171925-17345-205583083268026="` echo /root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026 `" ) && sleep 0' 13830 1727204114.92203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.92216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.92241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.92255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.92303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204114.92315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.92365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.94225: stdout chunk (state=3): >>>ansible-tmp-1727204114.9171925-17345-205583083268026=/root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026 <<< 13830 1727204114.94342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.94404: stderr chunk (state=3): >>><<< 13830 1727204114.94407: stdout chunk (state=3): >>><<< 13830 1727204114.94434: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204114.9171925-17345-205583083268026=/root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 13830 1727204114.94479: variable 'ansible_module_compression' from source: unknown 13830 1727204114.94536: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13830 1727204114.94568: variable 'ansible_facts' from source: unknown 13830 1727204114.94629: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026/AnsiballZ_stat.py 13830 1727204114.94753: Sending initial data 13830 1727204114.94757: Sent initial data (153 bytes) 13830 1727204114.95543: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.95547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.95557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.95597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.95919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204114.96095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204114.96187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204114.96222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204114.96237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204114.96249: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204114.96265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204114.96345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204114.96357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204114.96369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204114.96419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204114.98485: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204114.98489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpn1gxr0k_ /root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026/AnsiballZ_stat.py <<< 13830 1727204114.98515: stderr chunk (state=3): >>>debug1: 
Couldn't stat remote file: No such file or directory <<< 13830 1727204114.99820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204114.99906: stderr chunk (state=3): >>><<< 13830 1727204114.99916: stdout chunk (state=3): >>><<< 13830 1727204114.99970: done transferring module to remote 13830 1727204114.99973: _low_level_execute_command(): starting 13830 1727204114.99975: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026/ /root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026/AnsiballZ_stat.py && sleep 0' 13830 1727204115.01191: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204115.01208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204115.01225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204115.01247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204115.01287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204115.01299: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204115.01325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204115.01347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204115.01359: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204115.01373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204115.01390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204115.01403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204115.01419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204115.01434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204115.01446: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204115.01459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204115.01545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204115.01573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204115.01589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204115.01658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204115.03497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204115.03500: stdout chunk (state=3): >>><<< 13830 1727204115.03502: stderr chunk (state=3): >>><<< 13830 1727204115.03598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204115.03602: _low_level_execute_command(): starting 13830 1727204115.03604: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026/AnsiballZ_stat.py && sleep 0' 13830 1727204115.04224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204115.04236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204115.04243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204115.04257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204115.04302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204115.04309: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204115.04318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204115.04334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204115.04337: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204115.04345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204115.04353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204115.04362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204115.04376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204115.04385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204115.04392: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204115.04404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204115.04486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204115.04505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204115.04512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204115.04987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204115.18393: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28797, "dev": 21, "nlink": 1, "atime": 1727204112.9002738, "mtime": 1727204112.9002738, "ctime": 
1727204112.9002738, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13830 1727204115.19588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204115.19594: stderr chunk (state=3): >>><<< 13830 1727204115.19597: stdout chunk (state=3): >>><<< 13830 1727204115.19621: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28797, "dev": 21, "nlink": 1, "atime": 1727204112.9002738, "mtime": 1727204112.9002738, "ctime": 1727204112.9002738, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204115.19676: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204115.19685: _low_level_execute_command(): starting 13830 1727204115.19691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204114.9171925-17345-205583083268026/ > /dev/null 2>&1 && sleep 0' 13830 1727204115.21123: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204115.21129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204115.21140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204115.21154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204115.21195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204115.21233: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204115.21237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204115.21240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204115.21252: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204115.21255: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204115.21258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204115.21279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204115.21282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204115.21289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204115.21295: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204115.21306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204115.21379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204115.21394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204115.21403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204115.21490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204115.23559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204115.23565: stdout chunk (state=3): >>><<< 13830 1727204115.23568: stderr chunk (state=3): >>><<< 13830 1727204115.23571: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204115.23573: handler run complete 13830 1727204115.23575: attempt loop complete, returning result 13830 1727204115.23577: _execute() done 13830 1727204115.23579: dumping result to json 13830 1727204115.23581: done dumping result, returning 13830 1727204115.23583: done running TaskExecutor() for managed-node3/TASK: Get stat for interface test2 [0affcd87-79f5-1659-6b02-000000000a01] 13830 1727204115.23585: sending task result for task 0affcd87-79f5-1659-6b02-000000000a01 13830 1727204115.23660: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a01 13830 1727204115.23663: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204112.9002738, "block_size": 4096, "blocks": 0, "ctime": 1727204112.9002738, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28797, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204112.9002738, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13830 1727204115.23750: no more pending results, returning what we have 13830 1727204115.23754: results queue empty 13830 1727204115.23754: checking for any_errors_fatal 13830 1727204115.23756: done checking for any_errors_fatal 13830 1727204115.23757: checking for max_fail_percentage 13830 1727204115.23758: done checking for max_fail_percentage 13830 1727204115.23759: checking to see if all hosts have failed and the running result is not ok 13830 1727204115.23760: done checking to see if all hosts have failed 13830 1727204115.23761: getting the remaining hosts for this loop 13830 1727204115.23762: done getting the remaining hosts for this loop 13830 1727204115.23768: getting the next task for host managed-node3 13830 1727204115.23776: done getting next task for host managed-node3 13830 1727204115.23779: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13830 1727204115.23784: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204115.23792: getting variables 13830 1727204115.23794: in VariableManager get_vars() 13830 1727204115.23827: Calling all_inventory to load vars for managed-node3 13830 1727204115.23830: Calling groups_inventory to load vars for managed-node3 13830 1727204115.23834: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204115.23844: Calling all_plugins_play to load vars for managed-node3 13830 1727204115.23846: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204115.23848: Calling groups_plugins_play to load vars for managed-node3 13830 1727204115.26722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204115.29890: done with get_vars() 13830 1727204115.29920: done getting variables 13830 1727204115.30908: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204115.31157: variable 'interface' from source: task vars 13830 1727204115.31162: variable 'dhcp_interface2' from source: play vars 13830 1727204115.31801: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.445) 0:00:48.398 ***** 13830 1727204115.32073: entering _queue_task() for managed-node3/assert 13830 1727204115.32619: worker is 1 (out of 1 available) 13830 1727204115.32634: exiting _queue_task() for managed-node3/assert 13830 1727204115.32761: done queuing things up, now waiting for results queue to drain 13830 1727204115.32763: waiting for pending results... 
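The assert task queued above (assert_device_present.yml:5) consumes that interface_stat result; the trace further down confirms the only condition it evaluates is interface_stat.stat.exists. A minimal sketch of such a task (the failure message is illustrative and not taken from the playbook):

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists                      # evaluates to True here, hence "All assertions passed"
        msg: "Interface {{ interface }} is not present"     # assumed wording; the trace only shows the default success message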
13830 1727204115.35392: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' 13830 1727204115.35546: in run() - task 0affcd87-79f5-1659-6b02-000000000984 13830 1727204115.35562: variable 'ansible_search_path' from source: unknown 13830 1727204115.35568: variable 'ansible_search_path' from source: unknown 13830 1727204115.35631: calling self._execute() 13830 1727204115.35739: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204115.35747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204115.35759: variable 'omit' from source: magic vars 13830 1727204115.36151: variable 'ansible_distribution_major_version' from source: facts 13830 1727204115.36170: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204115.36177: variable 'omit' from source: magic vars 13830 1727204115.36518: variable 'omit' from source: magic vars 13830 1727204115.36622: variable 'interface' from source: task vars 13830 1727204115.36626: variable 'dhcp_interface2' from source: play vars 13830 1727204115.36701: variable 'dhcp_interface2' from source: play vars 13830 1727204115.36718: variable 'omit' from source: magic vars 13830 1727204115.36771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204115.36924: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204115.36946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204115.36963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204115.36975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204115.37122: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204115.37125: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204115.37128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204115.37373: Set connection var ansible_connection to ssh 13830 1727204115.37377: Set connection var ansible_timeout to 10 13830 1727204115.37379: Set connection var ansible_shell_executable to /bin/sh 13830 1727204115.37381: Set connection var ansible_shell_type to sh 13830 1727204115.37571: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204115.37575: Set connection var ansible_pipelining to False 13830 1727204115.37577: variable 'ansible_shell_executable' from source: unknown 13830 1727204115.37579: variable 'ansible_connection' from source: unknown 13830 1727204115.37581: variable 'ansible_module_compression' from source: unknown 13830 1727204115.37583: variable 'ansible_shell_type' from source: unknown 13830 1727204115.37585: variable 'ansible_shell_executable' from source: unknown 13830 1727204115.37586: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204115.37588: variable 'ansible_pipelining' from source: unknown 13830 1727204115.37591: variable 'ansible_timeout' from source: unknown 13830 1727204115.37593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204115.37738: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204115.37748: variable 'omit' from source: magic vars 13830 1727204115.37753: starting attempt loop 13830 1727204115.37861: running the handler 13830 1727204115.38328: variable 'interface_stat' from source: set_fact 13830 1727204115.38348: Evaluated conditional (interface_stat.stat.exists): True 13830 1727204115.38353: handler run complete 13830 1727204115.38369: attempt loop complete, returning result 13830 1727204115.38372: _execute() done 13830 1727204115.38374: dumping result to json 13830 1727204115.38377: done dumping result, returning 13830 1727204115.38384: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' [0affcd87-79f5-1659-6b02-000000000984] 13830 1727204115.38389: sending task result for task 0affcd87-79f5-1659-6b02-000000000984 13830 1727204115.38634: done sending task result for task 0affcd87-79f5-1659-6b02-000000000984 13830 1727204115.38637: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 13830 1727204115.38704: no more pending results, returning what we have 13830 1727204115.38708: results queue empty 13830 1727204115.38709: checking for any_errors_fatal 13830 1727204115.38720: done checking for any_errors_fatal 13830 1727204115.38720: checking for max_fail_percentage 13830 1727204115.38722: done checking for max_fail_percentage 13830 1727204115.38723: checking to see if all hosts have failed and the running result is not ok 13830 1727204115.38724: done checking to see if all hosts have failed 13830 1727204115.38725: getting the remaining hosts for this loop 13830 1727204115.38726: done getting the remaining hosts for this loop 13830 1727204115.38730: getting the next task for host managed-node3 13830 1727204115.38740: done getting next task for host managed-node3 13830 1727204115.38744: ^ task is: TASK: Test 13830 1727204115.38747: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204115.38751: getting variables 13830 1727204115.38753: in VariableManager get_vars() 13830 1727204115.38795: Calling all_inventory to load vars for managed-node3 13830 1727204115.38798: Calling groups_inventory to load vars for managed-node3 13830 1727204115.38800: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204115.38812: Calling all_plugins_play to load vars for managed-node3 13830 1727204115.38814: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204115.38817: Calling groups_plugins_play to load vars for managed-node3 13830 1727204115.40304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204115.45006: done with get_vars() 13830 1727204115.45043: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.131) 0:00:48.529 ***** 13830 1727204115.45177: entering _queue_task() for managed-node3/include_tasks 13830 1727204115.45637: worker is 1 (out of 1 available) 13830 1727204115.45653: exiting _queue_task() for managed-node3/include_tasks 13830 1727204115.45668: done queuing things up, now waiting for results queue to drain 13830 1727204115.45669: waiting for pending results... 13830 1727204115.46081: running TaskExecutor() for managed-node3/TASK: Test 13830 1727204115.46101: in run() - task 0affcd87-79f5-1659-6b02-0000000008ee 13830 1727204115.46121: variable 'ansible_search_path' from source: unknown 13830 1727204115.46127: variable 'ansible_search_path' from source: unknown 13830 1727204115.46179: variable 'lsr_test' from source: include params 13830 1727204115.46385: variable 'lsr_test' from source: include params 13830 1727204115.46459: variable 'omit' from source: magic vars 13830 1727204115.46618: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204115.46631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204115.46651: variable 'omit' from source: magic vars 13830 1727204115.46886: variable 'ansible_distribution_major_version' from source: facts 13830 1727204115.46900: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204115.46909: variable 'item' from source: unknown 13830 1727204115.46980: variable 'item' from source: unknown 13830 1727204115.47012: variable 'item' from source: unknown 13830 1727204115.47076: variable 'item' from source: unknown 13830 1727204115.47227: dumping result to json 13830 1727204115.47239: done dumping result, returning 13830 1727204115.47249: done running TaskExecutor() for managed-node3/TASK: Test [0affcd87-79f5-1659-6b02-0000000008ee] 13830 1727204115.47258: sending task result for task 0affcd87-79f5-1659-6b02-0000000008ee 13830 1727204115.47349: no more pending results, returning what we have 13830 1727204115.47354: in VariableManager get_vars() 13830 1727204115.47401: Calling all_inventory to load vars for managed-node3 13830 1727204115.47404: Calling groups_inventory to load vars for managed-node3 13830 1727204115.47407: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204115.47421: Calling all_plugins_play to load vars for managed-node3 13830 1727204115.47426: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204115.47438: Calling groups_plugins_play to 
load vars for managed-node3 13830 1727204115.48992: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008ee 13830 1727204115.48996: WORKER PROCESS EXITING 13830 1727204115.50457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204115.52386: done with get_vars() 13830 1727204115.52424: variable 'ansible_search_path' from source: unknown 13830 1727204115.52425: variable 'ansible_search_path' from source: unknown 13830 1727204115.52494: we have included files to process 13830 1727204115.52496: generating all_blocks data 13830 1727204115.52499: done generating all_blocks data 13830 1727204115.52504: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 13830 1727204115.52511: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 13830 1727204115.52518: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 13830 1727204115.52916: in VariableManager get_vars() 13830 1727204115.52944: done with get_vars() 13830 1727204115.52950: variable 'omit' from source: magic vars 13830 1727204115.53003: variable 'omit' from source: magic vars 13830 1727204115.53065: in VariableManager get_vars() 13830 1727204115.53081: done with get_vars() 13830 1727204115.53107: in VariableManager get_vars() 13830 1727204115.53126: done with get_vars() 13830 1727204115.53188: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13830 1727204115.53452: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13830 1727204115.53601: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13830 1727204115.54139: in VariableManager get_vars() 13830 1727204115.54187: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13830 1727204115.57360: done processing included file 13830 1727204115.57362: iterating over new_blocks loaded from include file 13830 1727204115.57366: in VariableManager get_vars() 13830 1727204115.57403: done with get_vars() 13830 1727204115.57405: filtering new block on tags 13830 1727204115.57872: done filtering new block on tags 13830 1727204115.57876: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml for managed-node3 => (item=tasks/create_bond_profile_reconfigure.yml) 13830 1727204115.57882: extending task lists for all hosts with included blocks 13830 1727204115.59835: done extending task lists 13830 1727204115.59837: done processing included files 13830 1727204115.59843: results queue empty 13830 1727204115.59844: checking for any_errors_fatal 13830 1727204115.59849: done checking for any_errors_fatal 13830 1727204115.59849: checking for max_fail_percentage 13830 1727204115.59851: done checking for max_fail_percentage 13830 1727204115.59852: checking to see if all hosts have failed and the running result is not ok 13830 1727204115.59853: done checking to see if all hosts have failed 13830 1727204115.59853: getting the remaining hosts for this loop 13830 
1727204115.59855: done getting the remaining hosts for this loop 13830 1727204115.59860: getting the next task for host managed-node3 13830 1727204115.59867: done getting next task for host managed-node3 13830 1727204115.59870: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13830 1727204115.59880: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204115.59897: getting variables 13830 1727204115.59900: in VariableManager get_vars() 13830 1727204115.59932: Calling all_inventory to load vars for managed-node3 13830 1727204115.59938: Calling groups_inventory to load vars for managed-node3 13830 1727204115.59940: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204115.59949: Calling all_plugins_play to load vars for managed-node3 13830 1727204115.59963: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204115.59969: Calling groups_plugins_play to load vars for managed-node3 13830 1727204115.61663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204115.63684: done with get_vars() 13830 1727204115.63740: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.187) 0:00:48.716 ***** 13830 1727204115.63887: entering _queue_task() for managed-node3/include_tasks 13830 1727204115.64307: worker is 1 (out of 1 available) 13830 1727204115.64320: exiting _queue_task() for managed-node3/include_tasks 13830 1727204115.64333: done queuing things up, now waiting for results queue to drain 13830 1727204115.64335: waiting for pending results... 
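The 'Test' task at run_test.yml:30 is an include_tasks loop: the item comes from lsr_test (an include parameter), and in this run it resolves to tasks/create_bond_profile_reconfigure.yml, which in turn pulls in the fedora.linux_system_roles.network role. A plausible shape for that include, assuming lsr_test is a plain list of task files as the single item= value suggests:

    - name: Test
      ansible.builtin.include_tasks: "{{ item }}"
      loop: "{{ lsr_test }}"   # here: ['tasks/create_bond_profile_reconfigure.yml']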
13830 1727204115.64781: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13830 1727204115.65059: in run() - task 0affcd87-79f5-1659-6b02-000000000a2e 13830 1727204115.65096: variable 'ansible_search_path' from source: unknown 13830 1727204115.65108: variable 'ansible_search_path' from source: unknown 13830 1727204115.65166: calling self._execute() 13830 1727204115.65325: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204115.65341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204115.65373: variable 'omit' from source: magic vars 13830 1727204115.65848: variable 'ansible_distribution_major_version' from source: facts 13830 1727204115.65886: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204115.65900: _execute() done 13830 1727204115.65908: dumping result to json 13830 1727204115.65915: done dumping result, returning 13830 1727204115.65926: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1659-6b02-000000000a2e] 13830 1727204115.65949: sending task result for task 0affcd87-79f5-1659-6b02-000000000a2e 13830 1727204115.66153: no more pending results, returning what we have 13830 1727204115.66160: in VariableManager get_vars() 13830 1727204115.66224: Calling all_inventory to load vars for managed-node3 13830 1727204115.66228: Calling groups_inventory to load vars for managed-node3 13830 1727204115.66230: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204115.66248: Calling all_plugins_play to load vars for managed-node3 13830 1727204115.66253: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204115.66256: Calling groups_plugins_play to load vars for managed-node3 13830 1727204115.67361: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a2e 13830 1727204115.67366: WORKER PROCESS EXITING 13830 1727204115.68186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204115.71588: done with get_vars() 13830 1727204115.71736: variable 'ansible_search_path' from source: unknown 13830 1727204115.71737: variable 'ansible_search_path' from source: unknown 13830 1727204115.71784: we have included files to process 13830 1727204115.71786: generating all_blocks data 13830 1727204115.71788: done generating all_blocks data 13830 1727204115.71790: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204115.71791: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204115.71793: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204115.73209: done processing included file 13830 1727204115.73212: iterating over new_blocks loaded from include file 13830 1727204115.73214: in VariableManager get_vars() 13830 1727204115.73252: done with get_vars() 13830 1727204115.73254: filtering new block on tags 13830 1727204115.73306: done filtering new block on tags 13830 1727204115.73309: in VariableManager get_vars() 13830 1727204115.73342: done with get_vars() 13830 1727204115.73344: filtering new block on tags 13830 1727204115.73399: done filtering new block on tags 13830 1727204115.73402: in 
VariableManager get_vars() 13830 1727204115.73429: done with get_vars() 13830 1727204115.73434: filtering new block on tags 13830 1727204115.73486: done filtering new block on tags 13830 1727204115.73489: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 13830 1727204115.73494: extending task lists for all hosts with included blocks 13830 1727204115.77026: done extending task lists 13830 1727204115.77028: done processing included files 13830 1727204115.77028: results queue empty 13830 1727204115.77029: checking for any_errors_fatal 13830 1727204115.77036: done checking for any_errors_fatal 13830 1727204115.77037: checking for max_fail_percentage 13830 1727204115.77038: done checking for max_fail_percentage 13830 1727204115.77039: checking to see if all hosts have failed and the running result is not ok 13830 1727204115.77040: done checking to see if all hosts have failed 13830 1727204115.77041: getting the remaining hosts for this loop 13830 1727204115.77042: done getting the remaining hosts for this loop 13830 1727204115.77044: getting the next task for host managed-node3 13830 1727204115.77050: done getting next task for host managed-node3 13830 1727204115.77052: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13830 1727204115.77057: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204115.77174: getting variables 13830 1727204115.77179: in VariableManager get_vars() 13830 1727204115.77202: Calling all_inventory to load vars for managed-node3 13830 1727204115.77205: Calling groups_inventory to load vars for managed-node3 13830 1727204115.77207: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204115.77213: Calling all_plugins_play to load vars for managed-node3 13830 1727204115.77216: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204115.77219: Calling groups_plugins_play to load vars for managed-node3 13830 1727204115.80258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204115.83962: done with get_vars() 13830 1727204115.84204: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.204) 0:00:48.920 ***** 13830 1727204115.84290: entering _queue_task() for managed-node3/setup 13830 1727204115.85137: worker is 1 (out of 1 available) 13830 1727204115.85151: exiting _queue_task() for managed-node3/setup 13830 1727204115.85187: done queuing things up, now waiting for results queue to drain 13830 1727204115.85190: waiting for pending results... 13830 1727204115.86044: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13830 1727204115.86466: in run() - task 0affcd87-79f5-1659-6b02-000000000b10 13830 1727204115.86482: variable 'ansible_search_path' from source: unknown 13830 1727204115.86486: variable 'ansible_search_path' from source: unknown 13830 1727204115.86579: calling self._execute() 13830 1727204115.86784: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204115.87081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204115.87095: variable 'omit' from source: magic vars 13830 1727204115.87498: variable 'ansible_distribution_major_version' from source: facts 13830 1727204115.87517: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204115.87754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204115.90300: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204115.90371: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204115.90489: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204115.90552: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204115.90582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204115.90668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204115.90698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13830 1727204115.90723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204115.90772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204115.90788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204115.90841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204115.90874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204115.90899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204115.90939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204115.91007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204115.91217: variable '__network_required_facts' from source: role '' defaults 13830 1727204115.91234: variable 'ansible_facts' from source: unknown 13830 1727204115.92072: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13830 1727204115.92080: when evaluation is False, skipping this task 13830 1727204115.92087: _execute() done 13830 1727204115.92094: dumping result to json 13830 1727204115.92100: done dumping result, returning 13830 1727204115.92111: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1659-6b02-000000000b10] 13830 1727204115.92123: sending task result for task 0affcd87-79f5-1659-6b02-000000000b10 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204115.92277: no more pending results, returning what we have 13830 1727204115.92282: results queue empty 13830 1727204115.92282: checking for any_errors_fatal 13830 1727204115.92284: done checking for any_errors_fatal 13830 1727204115.92285: checking for max_fail_percentage 13830 1727204115.92287: done checking for max_fail_percentage 13830 1727204115.92288: checking to see if all hosts have failed and the running result is not ok 13830 1727204115.92289: done checking to see if all hosts have failed 13830 1727204115.92290: getting the remaining hosts for this loop 13830 1727204115.92291: done getting the remaining hosts for this loop 13830 1727204115.92296: getting the next task for host managed-node3 13830 1727204115.92308: done getting next task for host 
managed-node3 13830 1727204115.92312: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13830 1727204115.92318: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204115.92344: getting variables 13830 1727204115.92346: in VariableManager get_vars() 13830 1727204115.92400: Calling all_inventory to load vars for managed-node3 13830 1727204115.92403: Calling groups_inventory to load vars for managed-node3 13830 1727204115.92406: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204115.92418: Calling all_plugins_play to load vars for managed-node3 13830 1727204115.92421: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204115.92425: Calling groups_plugins_play to load vars for managed-node3 13830 1727204115.93812: done sending task result for task 0affcd87-79f5-1659-6b02-000000000b10 13830 1727204115.93824: WORKER PROCESS EXITING 13830 1727204115.96119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204116.00099: done with get_vars() 13830 1727204116.00131: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.160) 0:00:49.081 ***** 13830 1727204116.00359: entering _queue_task() for managed-node3/stat 13830 1727204116.01113: worker is 1 (out of 1 available) 13830 1727204116.01129: exiting _queue_task() for managed-node3/stat 13830 1727204116.01145: done queuing things up, now waiting for results queue to drain 13830 1727204116.01147: waiting for pending results... 
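The fact-gathering task at set_facts.yml:3 was skipped because every fact named in __network_required_facts (a role default) is already present in ansible_facts, so the difference() filter returns an empty list and the conditional is False; the 'censored' skip result also shows the task runs with no_log: true. A sketch of what such a guarded setup task can look like (the exact setup arguments are hidden by no_log in this trace, so gather_subset below is an assumption):

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min    # assumption; the real arguments are not visible in the censored output
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true            # matches the "output has been hidden" skip message above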
13830 1727204116.01985: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13830 1727204116.02298: in run() - task 0affcd87-79f5-1659-6b02-000000000b12 13830 1727204116.02314: variable 'ansible_search_path' from source: unknown 13830 1727204116.02317: variable 'ansible_search_path' from source: unknown 13830 1727204116.02520: calling self._execute() 13830 1727204116.02780: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204116.02792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204116.02811: variable 'omit' from source: magic vars 13830 1727204116.03553: variable 'ansible_distribution_major_version' from source: facts 13830 1727204116.03710: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204116.03995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204116.04646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204116.04717: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204116.04815: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204116.04910: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204116.05086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204116.05241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204116.05275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204116.05304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204116.05527: variable '__network_is_ostree' from source: set_fact 13830 1727204116.05659: Evaluated conditional (not __network_is_ostree is defined): False 13830 1727204116.05669: when evaluation is False, skipping this task 13830 1727204116.05676: _execute() done 13830 1727204116.05682: dumping result to json 13830 1727204116.05689: done dumping result, returning 13830 1727204116.05700: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1659-6b02-000000000b12] 13830 1727204116.05709: sending task result for task 0affcd87-79f5-1659-6b02-000000000b12 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13830 1727204116.05875: no more pending results, returning what we have 13830 1727204116.05880: results queue empty 13830 1727204116.05880: checking for any_errors_fatal 13830 1727204116.05891: done checking for any_errors_fatal 13830 1727204116.05892: checking for max_fail_percentage 13830 1727204116.05894: done checking for max_fail_percentage 13830 1727204116.05895: checking to see if all hosts have 
failed and the running result is not ok 13830 1727204116.05896: done checking to see if all hosts have failed 13830 1727204116.05896: getting the remaining hosts for this loop 13830 1727204116.05898: done getting the remaining hosts for this loop 13830 1727204116.05903: getting the next task for host managed-node3 13830 1727204116.05912: done getting next task for host managed-node3 13830 1727204116.05915: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13830 1727204116.05921: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204116.05948: getting variables 13830 1727204116.05950: in VariableManager get_vars() 13830 1727204116.06000: Calling all_inventory to load vars for managed-node3 13830 1727204116.06003: Calling groups_inventory to load vars for managed-node3 13830 1727204116.06006: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204116.06017: Calling all_plugins_play to load vars for managed-node3 13830 1727204116.06020: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204116.06023: Calling groups_plugins_play to load vars for managed-node3 13830 1727204116.07160: done sending task result for task 0affcd87-79f5-1659-6b02-000000000b12 13830 1727204116.07165: WORKER PROCESS EXITING 13830 1727204116.09052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204116.12904: done with get_vars() 13830 1727204116.12943: done getting variables 13830 1727204116.13126: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.128) 0:00:49.209 ***** 13830 1727204116.13178: entering _queue_task() for managed-node3/set_fact 13830 1727204116.14008: worker is 1 (out of 1 available) 13830 1727204116.14022: exiting _queue_task() for managed-node3/set_fact 13830 1727204116.14036: done queuing things up, now waiting for results queue to drain 13830 1727204116.14038: waiting for pending results... 
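Both ostree-related tasks at set_facts.yml:12 and :17 are gated on 'not __network_is_ostree is defined'; because an earlier part of the run already set that fact, both are skipped. Sketched together below; the /run/ostree-booted path and the intermediate register name are assumptions, since the trace only shows the task names, the plugins used (stat and set_fact), and the conditional:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted          # assumed path; not visible in this trace
      register: __ostree_booted_stat      # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined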
13830 1727204116.14897: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13830 1727204116.15325: in run() - task 0affcd87-79f5-1659-6b02-000000000b13 13830 1727204116.15340: variable 'ansible_search_path' from source: unknown 13830 1727204116.15344: variable 'ansible_search_path' from source: unknown 13830 1727204116.15383: calling self._execute() 13830 1727204116.15729: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204116.15737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204116.15746: variable 'omit' from source: magic vars 13830 1727204116.16438: variable 'ansible_distribution_major_version' from source: facts 13830 1727204116.16448: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204116.16847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204116.17517: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204116.17558: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204116.17709: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204116.17741: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204116.17861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204116.17887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204116.18040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204116.18067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204116.18319: variable '__network_is_ostree' from source: set_fact 13830 1727204116.18327: Evaluated conditional (not __network_is_ostree is defined): False 13830 1727204116.18445: when evaluation is False, skipping this task 13830 1727204116.18449: _execute() done 13830 1727204116.18452: dumping result to json 13830 1727204116.18454: done dumping result, returning 13830 1727204116.18463: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1659-6b02-000000000b13] 13830 1727204116.18469: sending task result for task 0affcd87-79f5-1659-6b02-000000000b13 13830 1727204116.18579: done sending task result for task 0affcd87-79f5-1659-6b02-000000000b13 13830 1727204116.18584: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13830 1727204116.18645: no more pending results, returning what we have 13830 1727204116.18649: results queue empty 13830 1727204116.18650: checking for any_errors_fatal 13830 1727204116.18658: done checking for any_errors_fatal 13830 
1727204116.18659: checking for max_fail_percentage 13830 1727204116.18660: done checking for max_fail_percentage 13830 1727204116.18661: checking to see if all hosts have failed and the running result is not ok 13830 1727204116.18662: done checking to see if all hosts have failed 13830 1727204116.18662: getting the remaining hosts for this loop 13830 1727204116.18666: done getting the remaining hosts for this loop 13830 1727204116.18670: getting the next task for host managed-node3 13830 1727204116.18681: done getting next task for host managed-node3 13830 1727204116.18684: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13830 1727204116.18690: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204116.18714: getting variables 13830 1727204116.18716: in VariableManager get_vars() 13830 1727204116.18762: Calling all_inventory to load vars for managed-node3 13830 1727204116.18767: Calling groups_inventory to load vars for managed-node3 13830 1727204116.18770: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204116.18781: Calling all_plugins_play to load vars for managed-node3 13830 1727204116.18784: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204116.18787: Calling groups_plugins_play to load vars for managed-node3 13830 1727204116.21297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204116.23695: done with get_vars() 13830 1727204116.23731: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.107) 0:00:49.317 ***** 13830 1727204116.23958: entering _queue_task() for managed-node3/service_facts 13830 1727204116.24987: worker is 1 (out of 1 available) 13830 1727204116.25002: exiting _queue_task() for managed-node3/service_facts 13830 1727204116.25014: done queuing things up, now waiting for results queue to drain 13830 1727204116.25016: waiting for pending results... 
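The next task, at set_facts.yml:21, dispatches the service_facts module for managed-node3 and, as the result further below shows, populates ansible_facts.services with one entry per systemd unit (name, state, status, source). A sketch of a task with that shape, plus an illustrative consumer that is not part of the role, assuming the gathered dictionary is keyed by unit name as the output below shows:

    - name: Check which services are running
      service_facts:
      # The trace below evaluates ansible_distribution_major_version != '6' for this
      # task; whether the condition sits on the task itself or on an enclosing block
      # is not visible in the log.

    - name: Illustrative consumer (not from the role) of the gathered services dict
      debug:
        msg: "NetworkManager is running"
      when: ansible_facts.services['NetworkManager.service'].state == 'running'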
13830 1727204116.25886: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 13830 1727204116.26059: in run() - task 0affcd87-79f5-1659-6b02-000000000b15 13830 1727204116.26084: variable 'ansible_search_path' from source: unknown 13830 1727204116.26092: variable 'ansible_search_path' from source: unknown 13830 1727204116.26140: calling self._execute() 13830 1727204116.26250: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204116.26260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204116.26276: variable 'omit' from source: magic vars 13830 1727204116.26667: variable 'ansible_distribution_major_version' from source: facts 13830 1727204116.26685: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204116.26702: variable 'omit' from source: magic vars 13830 1727204116.26863: variable 'omit' from source: magic vars 13830 1727204116.26920: variable 'omit' from source: magic vars 13830 1727204116.26976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204116.27023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204116.27052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204116.27078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204116.27098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204116.27138: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204116.27146: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204116.27153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204116.27271: Set connection var ansible_connection to ssh 13830 1727204116.27286: Set connection var ansible_timeout to 10 13830 1727204116.27296: Set connection var ansible_shell_executable to /bin/sh 13830 1727204116.27301: Set connection var ansible_shell_type to sh 13830 1727204116.27309: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204116.27335: Set connection var ansible_pipelining to False 13830 1727204116.27369: variable 'ansible_shell_executable' from source: unknown 13830 1727204116.27377: variable 'ansible_connection' from source: unknown 13830 1727204116.27384: variable 'ansible_module_compression' from source: unknown 13830 1727204116.27390: variable 'ansible_shell_type' from source: unknown 13830 1727204116.27395: variable 'ansible_shell_executable' from source: unknown 13830 1727204116.27401: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204116.27406: variable 'ansible_pipelining' from source: unknown 13830 1727204116.27412: variable 'ansible_timeout' from source: unknown 13830 1727204116.27418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204116.27668: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204116.27688: variable 'omit' from source: magic vars 13830 
1727204116.27698: starting attempt loop 13830 1727204116.27708: running the handler 13830 1727204116.27724: _low_level_execute_command(): starting 13830 1727204116.27738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204116.28751: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204116.28888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204116.28905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204116.28922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204116.28970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204116.28989: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204116.29002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204116.29020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204116.29030: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204116.29044: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204116.29055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204116.29071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204116.29091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204116.29102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204116.29112: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204116.29125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204116.29288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204116.29310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204116.29325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204116.29541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204116.31184: stdout chunk (state=3): >>>/root <<< 13830 1727204116.31394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204116.31398: stdout chunk (state=3): >>><<< 13830 1727204116.31400: stderr chunk (state=3): >>><<< 13830 1727204116.31528: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204116.31532: _low_level_execute_command(): starting 13830 1727204116.31535: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748 `" && echo ansible-tmp-1727204116.31423-17506-249144535622748="` echo /root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748 `" ) && sleep 0' 13830 1727204116.32757: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204116.32762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204116.32859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204116.32865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204116.32868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204116.32871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204116.33020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204116.33091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204116.33095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204116.33160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204116.35131: stdout chunk (state=3): >>>ansible-tmp-1727204116.31423-17506-249144535622748=/root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748 <<< 13830 1727204116.35230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204116.35315: stderr chunk (state=3): >>><<< 13830 1727204116.35319: stdout chunk (state=3): >>><<< 13830 1727204116.35329: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204116.31423-17506-249144535622748=/root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204116.35389: variable 'ansible_module_compression' from source: unknown 13830 1727204116.35438: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13830 1727204116.35478: variable 'ansible_facts' from source: unknown 13830 1727204116.35596: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748/AnsiballZ_service_facts.py 13830 1727204116.36345: Sending initial data 13830 1727204116.36349: Sent initial data (160 bytes) 13830 1727204116.40872: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204116.40877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204116.40923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204116.40927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204116.40941: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204116.40946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204116.40960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204116.40971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204116.41050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204116.41068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204116.41084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204116.41250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204116.42961: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 
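The records around this point show the file-based module execution path: a temp directory is created under /root/.ansible/tmp, the cached AnsiballZ_service_facts.py payload is copied over SFTP, made executable, and then run with /usr/bin/python3.9. The connection variables set a few records earlier include ansible_pipelining = False, which is why the payload goes through a remote file rather than being fed to the interpreter's stdin. An illustrative host_vars sketch, assuming one wanted to pin the same behaviour explicitly; the values simply mirror what this trace reports for managed-node3:

    # Illustrative only -- mirrors the connection variables reported in the trace.
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false                        # false here, hence the temp-dir + SFTP transfer
    ansible_python_interpreter: /usr/bin/python3.9   # interpreter that runs AnsiballZ_service_facts.py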
13830 1727204116.43015: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204116.43045: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpb69e_04w /root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748/AnsiballZ_service_facts.py <<< 13830 1727204116.43092: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204116.44606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204116.44876: stderr chunk (state=3): >>><<< 13830 1727204116.44880: stdout chunk (state=3): >>><<< 13830 1727204116.44882: done transferring module to remote 13830 1727204116.44885: _low_level_execute_command(): starting 13830 1727204116.44887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748/ /root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748/AnsiballZ_service_facts.py && sleep 0' 13830 1727204116.45611: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204116.45615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204116.45649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204116.45655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204116.45663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204116.45720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204116.45880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204116.45887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204116.45944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204116.47730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204116.47739: stdout chunk (state=3): >>><<< 13830 1727204116.47741: stderr chunk (state=3): >>><<< 13830 1727204116.47787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204116.47791: _low_level_execute_command(): starting 13830 1727204116.47794: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748/AnsiballZ_service_facts.py && sleep 0' 13830 1727204116.49274: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204116.49280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204116.49285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204116.49287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204116.49365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204116.49371: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204116.49382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204116.49397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204116.49405: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204116.49411: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204116.49437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204116.49447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204116.49459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204116.49468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204116.49475: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204116.49485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204116.49557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204116.49577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204116.49581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204116.49870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204117.87384: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 13830 1727204117.87430: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": 
{"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 13830 1727204117.87437: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name<<< 13830 1727204117.87441: stdout chunk (state=3): >>>": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed<<< 13830 1727204117.87444: stdout chunk (state=3): >>>.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} <<< 13830 1727204117.88937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204117.88941: stderr chunk (state=3): >>><<< 13830 1727204117.88944: stdout chunk (state=3): >>><<< 13830 1727204117.88981: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204117.89784: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204117.89792: _low_level_execute_command(): starting 13830 1727204117.89797: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204116.31423-17506-249144535622748/ > /dev/null 2>&1 && sleep 0' 13830 1727204117.91531: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204117.91539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204117.91698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204117.91702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204117.91716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204117.91721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204117.91843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204117.91995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204117.91999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204117.92078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204117.94034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204117.94038: stderr chunk (state=3): >>><<< 13830 1727204117.94040: stdout chunk (state=3): >>><<< 13830 1727204117.94059: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204117.94067: handler run complete 13830 1727204117.94245: variable 'ansible_facts' from source: unknown 13830 1727204117.94398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204117.95498: variable 'ansible_facts' from source: unknown 13830 1727204117.95906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204117.96102: attempt loop complete, returning result 13830 1727204117.96107: _execute() done 13830 1727204117.96110: dumping result to json 13830 1727204117.96170: done dumping result, returning 13830 1727204117.96180: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1659-6b02-000000000b15] 13830 1727204117.96186: sending task result for task 0affcd87-79f5-1659-6b02-000000000b15 13830 1727204117.96953: done sending task result for task 0affcd87-79f5-1659-6b02-000000000b15 13830 1727204117.96956: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204117.97030: no more pending results, returning what we have 13830 1727204117.97033: results queue empty 13830 1727204117.97034: checking for any_errors_fatal 13830 1727204117.97038: done checking for any_errors_fatal 13830 1727204117.97039: checking for max_fail_percentage 13830 1727204117.97041: done checking for max_fail_percentage 13830 1727204117.97042: checking to see if all hosts have failed and the running result is not ok 13830 1727204117.97042: done checking to see if all hosts have failed 13830 1727204117.97043: getting the remaining hosts for this loop 13830 1727204117.97045: done getting the remaining hosts for this loop 13830 1727204117.97048: getting the next task for host managed-node3 13830 1727204117.97055: done getting next task for host managed-node3 13830 1727204117.97059: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13830 1727204117.97066: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204117.97078: getting variables 13830 1727204117.97080: in VariableManager get_vars() 13830 1727204117.97120: Calling all_inventory to load vars for managed-node3 13830 1727204117.97123: Calling groups_inventory to load vars for managed-node3 13830 1727204117.97126: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204117.97135: Calling all_plugins_play to load vars for managed-node3 13830 1727204117.97137: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204117.97146: Calling groups_plugins_play to load vars for managed-node3 13830 1727204117.99859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204118.03544: done with get_vars() 13830 1727204118.03704: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:18 -0400 (0:00:01.799) 0:00:51.117 ***** 13830 1727204118.03931: entering _queue_task() for managed-node3/package_facts 13830 1727204118.04638: worker is 1 (out of 1 available) 13830 1727204118.04652: exiting _queue_task() for managed-node3/package_facts 13830 1727204118.04775: done queuing things up, now waiting for results queue to drain 13830 1727204118.04778: waiting for pending results... 
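The package_facts exchange that follows repeats the same remote round-trip already seen for service_facts above: resolve the remote home (echo ~), create a throwaway ansible-tmp directory, transfer the AnsiballZ_package_facts.py payload into it over sftp, chmod it, run it with the remote Python, and finally remove the directory. A minimal local sketch of that lifecycle, assuming a POSIX /bin/sh and using placeholder paths rather than the real temp-directory names from this run:

import subprocess
import time

# Placeholder temp directory name; the real run derives it from a timestamp,
# a worker id, and a random suffix (ansible-tmp-<epoch>-<pid>-<rand>).
tmpdir = f"/tmp/ansible-tmp-demo-{int(time.time())}"

steps = [
    "echo ~",                                          # discover the home directory
    f"umask 77 && mkdir -p {tmpdir}",                  # create the private temp dir
    f"touch {tmpdir}/AnsiballZ_demo.py",               # stand-in for the sftp'd module payload
    f"chmod u+x {tmpdir} {tmpdir}/AnsiballZ_demo.py",  # make dir and payload executable
    f"rm -f -r {tmpdir}",                              # clean up, as the final rm -f -r step does
]
for cmd in steps:
    subprocess.run(["/bin/sh", "-c", cmd], check=True)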
13830 1727204118.05141: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13830 1727204118.05332: in run() - task 0affcd87-79f5-1659-6b02-000000000b16 13830 1727204118.05352: variable 'ansible_search_path' from source: unknown 13830 1727204118.05358: variable 'ansible_search_path' from source: unknown 13830 1727204118.05400: calling self._execute() 13830 1727204118.05511: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204118.05522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204118.05548: variable 'omit' from source: magic vars 13830 1727204118.05993: variable 'ansible_distribution_major_version' from source: facts 13830 1727204118.06012: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204118.06023: variable 'omit' from source: magic vars 13830 1727204118.06116: variable 'omit' from source: magic vars 13830 1727204118.06152: variable 'omit' from source: magic vars 13830 1727204118.06207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204118.06248: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204118.06278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204118.06307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204118.06324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204118.06359: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204118.06370: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204118.06378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204118.06485: Set connection var ansible_connection to ssh 13830 1727204118.06502: Set connection var ansible_timeout to 10 13830 1727204118.06520: Set connection var ansible_shell_executable to /bin/sh 13830 1727204118.06526: Set connection var ansible_shell_type to sh 13830 1727204118.06536: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204118.06549: Set connection var ansible_pipelining to False 13830 1727204118.06578: variable 'ansible_shell_executable' from source: unknown 13830 1727204118.06586: variable 'ansible_connection' from source: unknown 13830 1727204118.06593: variable 'ansible_module_compression' from source: unknown 13830 1727204118.06599: variable 'ansible_shell_type' from source: unknown 13830 1727204118.06605: variable 'ansible_shell_executable' from source: unknown 13830 1727204118.06612: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204118.06625: variable 'ansible_pipelining' from source: unknown 13830 1727204118.06633: variable 'ansible_timeout' from source: unknown 13830 1727204118.06642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204118.06941: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204118.06974: variable 'omit' from source: magic vars 13830 
1727204118.06982: starting attempt loop 13830 1727204118.06988: running the handler 13830 1727204118.07004: _low_level_execute_command(): starting 13830 1727204118.07029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204118.07856: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204118.07876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.07892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.07912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.07962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.07979: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204118.07994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.08015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204118.08028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204118.08040: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204118.08058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.08078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.08097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.08111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.08123: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204118.08139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.08222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204118.08246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204118.08266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204118.08345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204118.09990: stdout chunk (state=3): >>>/root <<< 13830 1727204118.10180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204118.10184: stdout chunk (state=3): >>><<< 13830 1727204118.10186: stderr chunk (state=3): >>><<< 13830 1727204118.10308: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204118.10311: _low_level_execute_command(): starting 13830 1727204118.10315: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203 `" && echo ansible-tmp-1727204118.1020777-17625-255121810990203="` echo /root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203 `" ) && sleep 0' 13830 1727204118.11121: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.11125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.11169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.11172: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.11175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.11177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.11242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204118.11257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204118.11350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204118.13241: stdout chunk (state=3): >>>ansible-tmp-1727204118.1020777-17625-255121810990203=/root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203 <<< 13830 1727204118.13389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204118.13451: stderr chunk (state=3): >>><<< 13830 1727204118.13455: stdout chunk (state=3): >>><<< 13830 1727204118.13675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204118.1020777-17625-255121810990203=/root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204118.13679: variable 'ansible_module_compression' from source: unknown 13830 1727204118.13681: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13830 1727204118.13683: variable 'ansible_facts' from source: unknown 13830 1727204118.13856: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203/AnsiballZ_package_facts.py 13830 1727204118.14800: Sending initial data 13830 1727204118.14803: Sent initial data (162 bytes) 13830 1727204118.16154: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.16158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.16200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204118.16205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.16207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204118.16210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.16260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204118.16789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204118.16792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204118.16851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204118.18580: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204118.18623: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server 
handle limit 1019; using 64 <<< 13830 1727204118.18658: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmplzdt2tcr /root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203/AnsiballZ_package_facts.py <<< 13830 1727204118.18695: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204118.21816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204118.21971: stderr chunk (state=3): >>><<< 13830 1727204118.21974: stdout chunk (state=3): >>><<< 13830 1727204118.22084: done transferring module to remote 13830 1727204118.22088: _low_level_execute_command(): starting 13830 1727204118.22090: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203/ /root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203/AnsiballZ_package_facts.py && sleep 0' 13830 1727204118.23741: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204118.23759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.23780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.23802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.23857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.23876: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204118.23891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.23910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204118.23938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204118.23950: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204118.23963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.23981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.23997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.24009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.24022: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204118.24040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.24129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204118.24146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204118.24170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204118.24384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204118.26430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204118.26434: stdout chunk (state=3): >>><<< 13830 1727204118.26436: stderr chunk (state=3): >>><<< 13830 1727204118.26544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204118.26548: _low_level_execute_command(): starting 13830 1727204118.26551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203/AnsiballZ_package_facts.py && sleep 0' 13830 1727204118.28298: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204118.28316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.28329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.28346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.28393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.28409: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204118.28423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.28439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204118.28449: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204118.28457: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204118.28468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.28480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.28499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.28509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.28526: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204118.28539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.28683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204118.28702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204118.28719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204118.28828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204118.75806: stdout chunk (state=3): >>> 
{"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": 
[{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "versi<<< 13830 1727204118.75824: stdout chunk (state=3): >>>on": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", 
"version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", <<< 13830 1727204118.75832: stdout chunk (state=3): >>>"epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": 
"20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": 
"1.6.3", "re<<< 13830 1727204118.75876: stdout chunk (state=3): >>>lease": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": 
"2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": 
"38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noa<<< 13830 1727204118.75891: stdout chunk (state=3): >>>rch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", 
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch"<<< 13830 1727204118.75930: stdout chunk (state=3): >>>: "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "releas<<< 13830 1727204118.75938: stdout chunk (state=3): >>>e": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"na<<< 13830 1727204118.75941: stdout chunk (state=3): >>>me": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", 
"version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt<<< 13830 1727204118.75984: stdout chunk (state=3): >>>-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nf<<< 13830 1727204118.75990: stdout chunk (state=3): >>>s-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "e<<< 13830 1727204118.75994: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": 
"6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13830 1727204118.78188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204118.78262: stderr chunk (state=3): >>><<< 13830 1727204118.78269: stdout chunk (state=3): >>><<< 13830 1727204118.78482: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204118.85996: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204118.86027: _low_level_execute_command(): starting 13830 1727204118.86039: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204118.1020777-17625-255121810990203/ > /dev/null 2>&1 && sleep 0' 13830 1727204118.86699: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204118.86716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.86731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.86749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.86791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.86801: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204118.86815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.86831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204118.86842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204118.86851: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204118.86860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204118.86877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204118.86890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204118.86900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204118.86910: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204118.86922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204118.86996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204118.87012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204118.87026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204118.87133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204118.89049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204118.89137: stderr chunk (state=3): >>><<< 13830 1727204118.89140: stdout chunk (state=3): >>><<< 13830 1727204118.89746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204118.89752: handler run complete 13830 1727204118.90377: variable 'ansible_facts' from source: unknown 13830 1727204118.90876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204118.92962: variable 'ansible_facts' from source: unknown 13830 1727204118.93450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204118.94221: attempt loop complete, returning result 13830 1727204118.94243: _execute() done 13830 1727204118.94252: dumping result to json 13830 1727204118.94482: done dumping result, returning 13830 1727204118.94498: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1659-6b02-000000000b16] 13830 1727204118.94508: sending task result for task 0affcd87-79f5-1659-6b02-000000000b16 13830 1727204119.02421: done sending task result for task 0affcd87-79f5-1659-6b02-000000000b16 13830 1727204119.02425: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204119.02537: no more pending results, returning what we have 13830 1727204119.02539: results queue empty 13830 1727204119.02540: checking for any_errors_fatal 13830 1727204119.02544: done checking for any_errors_fatal 13830 1727204119.02545: checking for max_fail_percentage 13830 1727204119.02546: done checking for max_fail_percentage 13830 1727204119.02547: checking to see if all hosts have failed and the running result is not ok 13830 1727204119.02547: done checking to see if all hosts have failed 13830 1727204119.02548: getting the remaining hosts for this loop 13830 1727204119.02549: done getting the remaining hosts for this loop 13830 1727204119.02552: getting the next task for host managed-node3 13830 1727204119.02558: done getting next task for host managed-node3 13830 1727204119.02561: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13830 1727204119.02568: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204119.02578: getting variables 13830 1727204119.02579: in VariableManager get_vars() 13830 1727204119.02602: Calling all_inventory to load vars for managed-node3 13830 1727204119.02604: Calling groups_inventory to load vars for managed-node3 13830 1727204119.02611: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204119.02617: Calling all_plugins_play to load vars for managed-node3 13830 1727204119.02619: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204119.02622: Calling groups_plugins_play to load vars for managed-node3 13830 1727204119.04363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204119.06204: done with get_vars() 13830 1727204119.06234: done getting variables 13830 1727204119.06292: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:19 -0400 (0:00:01.023) 0:00:52.141 ***** 13830 1727204119.06331: entering _queue_task() for managed-node3/debug 13830 1727204119.06695: worker is 1 (out of 1 available) 13830 1727204119.06708: exiting _queue_task() for managed-node3/debug 13830 1727204119.06721: done queuing things up, now waiting for results queue to drain 13830 1727204119.06722: waiting for pending results... 
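The task just queued, "Print network provider" (tasks/main.yml:7 of the role), is a plain debug action: the trace below shows it resolving the network_provider fact set earlier via set_fact and printing "Using network provider: nm". A hedged approximation of such a task, not the role's literal source; the distribution-version guard seen in the trace may be inherited from an enclosing block rather than written on the task itself:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'   # evaluated to True in the trace below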
13830 1727204119.07958: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 13830 1727204119.08125: in run() - task 0affcd87-79f5-1659-6b02-000000000a2f 13830 1727204119.08265: variable 'ansible_search_path' from source: unknown 13830 1727204119.08274: variable 'ansible_search_path' from source: unknown 13830 1727204119.08315: calling self._execute() 13830 1727204119.08449: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.08581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.08596: variable 'omit' from source: magic vars 13830 1727204119.09429: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.09454: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204119.09561: variable 'omit' from source: magic vars 13830 1727204119.09645: variable 'omit' from source: magic vars 13830 1727204119.09886: variable 'network_provider' from source: set_fact 13830 1727204119.10013: variable 'omit' from source: magic vars 13830 1727204119.10066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204119.10112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204119.10230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204119.10254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204119.10275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204119.10311: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204119.10432: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.10441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.10662: Set connection var ansible_connection to ssh 13830 1727204119.10680: Set connection var ansible_timeout to 10 13830 1727204119.10688: Set connection var ansible_shell_executable to /bin/sh 13830 1727204119.10693: Set connection var ansible_shell_type to sh 13830 1727204119.10699: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204119.10709: Set connection var ansible_pipelining to False 13830 1727204119.10734: variable 'ansible_shell_executable' from source: unknown 13830 1727204119.10742: variable 'ansible_connection' from source: unknown 13830 1727204119.10751: variable 'ansible_module_compression' from source: unknown 13830 1727204119.10757: variable 'ansible_shell_type' from source: unknown 13830 1727204119.10763: variable 'ansible_shell_executable' from source: unknown 13830 1727204119.10771: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.10777: variable 'ansible_pipelining' from source: unknown 13830 1727204119.10782: variable 'ansible_timeout' from source: unknown 13830 1727204119.10788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.11037: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 13830 1727204119.11099: variable 'omit' from source: magic vars 13830 1727204119.11179: starting attempt loop 13830 1727204119.11191: running the handler 13830 1727204119.11245: handler run complete 13830 1727204119.11317: attempt loop complete, returning result 13830 1727204119.11324: _execute() done 13830 1727204119.11331: dumping result to json 13830 1727204119.11337: done dumping result, returning 13830 1727204119.11350: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1659-6b02-000000000a2f] 13830 1727204119.11359: sending task result for task 0affcd87-79f5-1659-6b02-000000000a2f 13830 1727204119.11602: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a2f 13830 1727204119.11611: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 13830 1727204119.11694: no more pending results, returning what we have 13830 1727204119.11698: results queue empty 13830 1727204119.11699: checking for any_errors_fatal 13830 1727204119.11713: done checking for any_errors_fatal 13830 1727204119.11714: checking for max_fail_percentage 13830 1727204119.11716: done checking for max_fail_percentage 13830 1727204119.11717: checking to see if all hosts have failed and the running result is not ok 13830 1727204119.11717: done checking to see if all hosts have failed 13830 1727204119.11718: getting the remaining hosts for this loop 13830 1727204119.11720: done getting the remaining hosts for this loop 13830 1727204119.11726: getting the next task for host managed-node3 13830 1727204119.11734: done getting next task for host managed-node3 13830 1727204119.11739: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13830 1727204119.11745: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204119.11760: getting variables 13830 1727204119.11762: in VariableManager get_vars() 13830 1727204119.11814: Calling all_inventory to load vars for managed-node3 13830 1727204119.11818: Calling groups_inventory to load vars for managed-node3 13830 1727204119.11820: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204119.11831: Calling all_plugins_play to load vars for managed-node3 13830 1727204119.11834: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204119.11837: Calling groups_plugins_play to load vars for managed-node3 13830 1727204119.14107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204119.16794: done with get_vars() 13830 1727204119.16833: done getting variables 13830 1727204119.16897: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.106) 0:00:52.247 ***** 13830 1727204119.16942: entering _queue_task() for managed-node3/fail 13830 1727204119.17309: worker is 1 (out of 1 available) 13830 1727204119.17324: exiting _queue_task() for managed-node3/fail 13830 1727204119.17337: done queuing things up, now waiting for results queue to drain 13830 1727204119.17339: waiting for pending results... 
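The next task queued is a fail guard (tasks/main.yml:11) against the unsupported combination named in its title. In the trace that follows, only the first condition, network_state != {}, is evaluated; it is False for this run, so the task is skipped and any further provider check is never reached. A hedged sketch of a task with the same shape (the message wording and the commented-out provider check are illustrative assumptions, not the role's source):

- name: Abort when network_state is used with the initscripts provider
  ansible.builtin.fail:
    msg: "Applying the network state configuration is not supported by the initscripts provider."   # illustrative wording
  when:
    - network_state != {}                      # the only condition visible in this trace
    # - network_provider == "initscripts"      # assumed additional check, not shown in this log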
13830 1727204119.18471: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13830 1727204119.19068: in run() - task 0affcd87-79f5-1659-6b02-000000000a30 13830 1727204119.19084: variable 'ansible_search_path' from source: unknown 13830 1727204119.19088: variable 'ansible_search_path' from source: unknown 13830 1727204119.19147: calling self._execute() 13830 1727204119.19333: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.19456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.19468: variable 'omit' from source: magic vars 13830 1727204119.20275: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.20288: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204119.20725: variable 'network_state' from source: role '' defaults 13830 1727204119.20729: Evaluated conditional (network_state != {}): False 13830 1727204119.20731: when evaluation is False, skipping this task 13830 1727204119.20733: _execute() done 13830 1727204119.20735: dumping result to json 13830 1727204119.20826: done dumping result, returning 13830 1727204119.20834: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1659-6b02-000000000a30] 13830 1727204119.20844: sending task result for task 0affcd87-79f5-1659-6b02-000000000a30 13830 1727204119.20949: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a30 13830 1727204119.20953: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204119.21003: no more pending results, returning what we have 13830 1727204119.21007: results queue empty 13830 1727204119.21007: checking for any_errors_fatal 13830 1727204119.21015: done checking for any_errors_fatal 13830 1727204119.21016: checking for max_fail_percentage 13830 1727204119.21018: done checking for max_fail_percentage 13830 1727204119.21018: checking to see if all hosts have failed and the running result is not ok 13830 1727204119.21019: done checking to see if all hosts have failed 13830 1727204119.21020: getting the remaining hosts for this loop 13830 1727204119.21022: done getting the remaining hosts for this loop 13830 1727204119.21026: getting the next task for host managed-node3 13830 1727204119.21034: done getting next task for host managed-node3 13830 1727204119.21038: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13830 1727204119.21043: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204119.21068: getting variables 13830 1727204119.21069: in VariableManager get_vars() 13830 1727204119.21114: Calling all_inventory to load vars for managed-node3 13830 1727204119.21116: Calling groups_inventory to load vars for managed-node3 13830 1727204119.21118: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204119.21130: Calling all_plugins_play to load vars for managed-node3 13830 1727204119.21132: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204119.21135: Calling groups_plugins_play to load vars for managed-node3 13830 1727204119.24260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204119.27247: done with get_vars() 13830 1727204119.27280: done getting variables 13830 1727204119.27339: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.104) 0:00:52.351 ***** 13830 1727204119.27378: entering _queue_task() for managed-node3/fail 13830 1727204119.28357: worker is 1 (out of 1 available) 13830 1727204119.28372: exiting _queue_task() for managed-node3/fail 13830 1727204119.28386: done queuing things up, now waiting for results queue to drain 13830 1727204119.28387: waiting for pending results... 
13830 1727204119.28707: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13830 1727204119.28904: in run() - task 0affcd87-79f5-1659-6b02-000000000a31 13830 1727204119.28926: variable 'ansible_search_path' from source: unknown 13830 1727204119.28935: variable 'ansible_search_path' from source: unknown 13830 1727204119.28986: calling self._execute() 13830 1727204119.29101: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.29112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.29126: variable 'omit' from source: magic vars 13830 1727204119.29532: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.29551: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204119.29687: variable 'network_state' from source: role '' defaults 13830 1727204119.29704: Evaluated conditional (network_state != {}): False 13830 1727204119.29717: when evaluation is False, skipping this task 13830 1727204119.29725: _execute() done 13830 1727204119.29732: dumping result to json 13830 1727204119.29739: done dumping result, returning 13830 1727204119.29750: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1659-6b02-000000000a31] 13830 1727204119.29762: sending task result for task 0affcd87-79f5-1659-6b02-000000000a31 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204119.29930: no more pending results, returning what we have 13830 1727204119.29935: results queue empty 13830 1727204119.29935: checking for any_errors_fatal 13830 1727204119.29942: done checking for any_errors_fatal 13830 1727204119.29943: checking for max_fail_percentage 13830 1727204119.29944: done checking for max_fail_percentage 13830 1727204119.29945: checking to see if all hosts have failed and the running result is not ok 13830 1727204119.29946: done checking to see if all hosts have failed 13830 1727204119.29947: getting the remaining hosts for this loop 13830 1727204119.29948: done getting the remaining hosts for this loop 13830 1727204119.29952: getting the next task for host managed-node3 13830 1727204119.29961: done getting next task for host managed-node3 13830 1727204119.29966: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13830 1727204119.29972: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204119.29995: getting variables 13830 1727204119.29997: in VariableManager get_vars() 13830 1727204119.30042: Calling all_inventory to load vars for managed-node3 13830 1727204119.30045: Calling groups_inventory to load vars for managed-node3 13830 1727204119.30047: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204119.30058: Calling all_plugins_play to load vars for managed-node3 13830 1727204119.30060: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204119.30062: Calling groups_plugins_play to load vars for managed-node3 13830 1727204119.31129: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a31 13830 1727204119.31135: WORKER PROCESS EXITING 13830 1727204119.31851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204119.33615: done with get_vars() 13830 1727204119.33657: done getting variables 13830 1727204119.33724: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.063) 0:00:52.415 ***** 13830 1727204119.33771: entering _queue_task() for managed-node3/fail 13830 1727204119.34971: worker is 1 (out of 1 available) 13830 1727204119.34985: exiting _queue_task() for managed-node3/fail 13830 1727204119.34997: done queuing things up, now waiting for results queue to drain 13830 1727204119.34998: waiting for pending results... 
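This queued task is another fail guard (tasks/main.yml:25). The trace below evaluates ansible_distribution_major_version | int > 9, which is False on this EL9 host, so the abort is skipped. A hedged sketch of an equivalent guard; the message wording is illustrative, and any extra conditions the role applies (for example, whether a team connection is actually requested) are not visible in this trace:

- name: Abort applying teaming configuration on EL10 or later
  ansible.builtin.fail:
    msg: "Team interfaces are not supported on this distribution version."   # illustrative wording
  when: ansible_distribution_major_version | int > 9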
13830 1727204119.35971: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13830 1727204119.36307: in run() - task 0affcd87-79f5-1659-6b02-000000000a32 13830 1727204119.36324: variable 'ansible_search_path' from source: unknown 13830 1727204119.36328: variable 'ansible_search_path' from source: unknown 13830 1727204119.36370: calling self._execute() 13830 1727204119.36583: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.36587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.36597: variable 'omit' from source: magic vars 13830 1727204119.37431: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.37446: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204119.37860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204119.43362: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204119.43431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204119.43472: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204119.43511: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204119.43541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204119.43623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.43655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.43683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.43722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.43745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.43847: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.43862: Evaluated conditional (ansible_distribution_major_version | int > 9): False 13830 1727204119.43867: when evaluation is False, skipping this task 13830 1727204119.43870: _execute() done 13830 1727204119.43872: dumping result to json 13830 1727204119.43875: done dumping result, returning 13830 1727204119.43886: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1659-6b02-000000000a32] 13830 1727204119.43889: sending task result for task 
0affcd87-79f5-1659-6b02-000000000a32 13830 1727204119.43988: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a32 13830 1727204119.43991: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 13830 1727204119.44043: no more pending results, returning what we have 13830 1727204119.44048: results queue empty 13830 1727204119.44048: checking for any_errors_fatal 13830 1727204119.44056: done checking for any_errors_fatal 13830 1727204119.44057: checking for max_fail_percentage 13830 1727204119.44059: done checking for max_fail_percentage 13830 1727204119.44060: checking to see if all hosts have failed and the running result is not ok 13830 1727204119.44060: done checking to see if all hosts have failed 13830 1727204119.44061: getting the remaining hosts for this loop 13830 1727204119.44063: done getting the remaining hosts for this loop 13830 1727204119.44069: getting the next task for host managed-node3 13830 1727204119.44077: done getting next task for host managed-node3 13830 1727204119.44081: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13830 1727204119.44086: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204119.44108: getting variables 13830 1727204119.44109: in VariableManager get_vars() 13830 1727204119.44154: Calling all_inventory to load vars for managed-node3 13830 1727204119.44158: Calling groups_inventory to load vars for managed-node3 13830 1727204119.44160: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204119.44171: Calling all_plugins_play to load vars for managed-node3 13830 1727204119.44173: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204119.44176: Calling groups_plugins_play to load vars for managed-node3 13830 1727204119.46799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204119.48499: done with get_vars() 13830 1727204119.48531: done getting variables 13830 1727204119.48599: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.148) 0:00:52.564 ***** 13830 1727204119.48637: entering _queue_task() for managed-node3/dnf 13830 1727204119.48998: worker is 1 (out of 1 available) 13830 1727204119.49009: exiting _queue_task() for managed-node3/dnf 13830 1727204119.49020: done queuing things up, now waiting for results queue to drain 13830 1727204119.49022: waiting for pending results... 
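The DNF check queued here (tasks/main.yml:36) only fires when wireless or team profiles are part of the requested configuration. The trace below evaluates two guards: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7 (True) and __network_wireless_connections_defined or __network_team_connections_defined (False, so the dnf module itself never runs). A hedged sketch with the same guard structure; the package name and state are placeholders, since the role's real arguments are never reached in this run:

- name: Check whether updates for network packages are available (wireless/team case)
  ansible.builtin.dnf:
    name: NetworkManager     # placeholder; the role's actual package list is not visible in this trace
    state: latest            # placeholder
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined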
13830 1727204119.49335: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13830 1727204119.49503: in run() - task 0affcd87-79f5-1659-6b02-000000000a33 13830 1727204119.49524: variable 'ansible_search_path' from source: unknown 13830 1727204119.49537: variable 'ansible_search_path' from source: unknown 13830 1727204119.49583: calling self._execute() 13830 1727204119.49690: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.49703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.49717: variable 'omit' from source: magic vars 13830 1727204119.50133: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.50152: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204119.50370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204119.53831: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204119.54040: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204119.54097: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204119.54138: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204119.54190: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204119.54271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.54328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.54362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.54420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.54438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.54574: variable 'ansible_distribution' from source: facts 13830 1727204119.54584: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.54608: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13830 1727204119.54746: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204119.54993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.55021: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.55049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.55099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.55116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.55154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.55188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.55218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.55261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.55293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.55337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.55419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.55533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.55580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.55646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.55955: variable 'network_connections' from source: task vars 13830 1727204119.55975: variable 'controller_profile' from source: play vars 13830 1727204119.56050: variable 'controller_profile' from source: play vars 13830 1727204119.56068: variable 'controller_device' from source: play vars 13830 1727204119.56140: variable 'controller_device' from source: play vars 13830 1727204119.56154: variable 'dhcp_interface1' from 
source: play vars 13830 1727204119.56221: variable 'dhcp_interface1' from source: play vars 13830 1727204119.56244: variable 'port1_profile' from source: play vars 13830 1727204119.56307: variable 'port1_profile' from source: play vars 13830 1727204119.56320: variable 'dhcp_interface1' from source: play vars 13830 1727204119.56390: variable 'dhcp_interface1' from source: play vars 13830 1727204119.56459: variable 'controller_profile' from source: play vars 13830 1727204119.56525: variable 'controller_profile' from source: play vars 13830 1727204119.56680: variable 'port2_profile' from source: play vars 13830 1727204119.56743: variable 'port2_profile' from source: play vars 13830 1727204119.56755: variable 'dhcp_interface2' from source: play vars 13830 1727204119.56832: variable 'dhcp_interface2' from source: play vars 13830 1727204119.57003: variable 'controller_profile' from source: play vars 13830 1727204119.57067: variable 'controller_profile' from source: play vars 13830 1727204119.57339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204119.57755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204119.57809: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204119.57893: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204119.58000: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204119.58234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204119.58309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204119.58411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.58445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204119.58631: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204119.59056: variable 'network_connections' from source: task vars 13830 1727204119.59181: variable 'controller_profile' from source: play vars 13830 1727204119.59261: variable 'controller_profile' from source: play vars 13830 1727204119.59338: variable 'controller_device' from source: play vars 13830 1727204119.59462: variable 'controller_device' from source: play vars 13830 1727204119.59511: variable 'dhcp_interface1' from source: play vars 13830 1727204119.59700: variable 'dhcp_interface1' from source: play vars 13830 1727204119.59719: variable 'port1_profile' from source: play vars 13830 1727204119.59787: variable 'port1_profile' from source: play vars 13830 1727204119.59801: variable 'dhcp_interface1' from source: play vars 13830 1727204119.59873: variable 'dhcp_interface1' from source: play vars 13830 1727204119.59885: variable 'controller_profile' from source: play vars 13830 1727204119.59956: variable 'controller_profile' from source: play vars 
13830 1727204119.59974: variable 'port2_profile' from source: play vars 13830 1727204119.60057: variable 'port2_profile' from source: play vars 13830 1727204119.60071: variable 'dhcp_interface2' from source: play vars 13830 1727204119.60245: variable 'dhcp_interface2' from source: play vars 13830 1727204119.60256: variable 'controller_profile' from source: play vars 13830 1727204119.60320: variable 'controller_profile' from source: play vars 13830 1727204119.60372: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204119.60381: when evaluation is False, skipping this task 13830 1727204119.60388: _execute() done 13830 1727204119.60394: dumping result to json 13830 1727204119.60402: done dumping result, returning 13830 1727204119.60415: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000a33] 13830 1727204119.60424: sending task result for task 0affcd87-79f5-1659-6b02-000000000a33 13830 1727204119.60555: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a33 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204119.60613: no more pending results, returning what we have 13830 1727204119.60617: results queue empty 13830 1727204119.60618: checking for any_errors_fatal 13830 1727204119.60627: done checking for any_errors_fatal 13830 1727204119.60628: checking for max_fail_percentage 13830 1727204119.60630: done checking for max_fail_percentage 13830 1727204119.60631: checking to see if all hosts have failed and the running result is not ok 13830 1727204119.60632: done checking to see if all hosts have failed 13830 1727204119.60632: getting the remaining hosts for this loop 13830 1727204119.60634: done getting the remaining hosts for this loop 13830 1727204119.60638: getting the next task for host managed-node3 13830 1727204119.60647: done getting next task for host managed-node3 13830 1727204119.60653: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13830 1727204119.60659: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 13830 1727204119.60685: getting variables 13830 1727204119.60688: in VariableManager get_vars() 13830 1727204119.60735: Calling all_inventory to load vars for managed-node3 13830 1727204119.60739: Calling groups_inventory to load vars for managed-node3 13830 1727204119.60741: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204119.60753: Calling all_plugins_play to load vars for managed-node3 13830 1727204119.60755: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204119.60758: Calling groups_plugins_play to load vars for managed-node3 13830 1727204119.61707: WORKER PROCESS EXITING 13830 1727204119.62622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204119.64351: done with get_vars() 13830 1727204119.64384: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13830 1727204119.64462: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.158) 0:00:52.723 ***** 13830 1727204119.64499: entering _queue_task() for managed-node3/yum 13830 1727204119.64853: worker is 1 (out of 1 available) 13830 1727204119.64867: exiting _queue_task() for managed-node3/yum 13830 1727204119.64884: done queuing things up, now waiting for results queue to drain 13830 1727204119.64886: waiting for pending results... 
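
Annotation for readers of this trace: the role keeps both a DNF-based and a YUM-based variant of the "check for network package updates" task, and on this host Ansible redirects ansible.builtin.yum to ansible.builtin.dnf (logged just above); the YUM variant is additionally guarded so it only runs on pre-EL8 systems, which is why it is skipped in the next few entries. The following is an illustrative sketch reconstructed only from the action name and the conditionals visible in the log, not the actual contents of roles/network/tasks/main.yml; the list: updates argument and the exact ordering of the when conditions are assumptions.

    # Sketch only -- reconstructed from the trace, not the role's real tasks file.
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:   # redirected to ansible.builtin.dnf on this host, as logged above
        list: updates        # assumed argument; only the module/action name appears in the log
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined
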
13830 1727204119.65189: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13830 1727204119.65359: in run() - task 0affcd87-79f5-1659-6b02-000000000a34 13830 1727204119.65381: variable 'ansible_search_path' from source: unknown 13830 1727204119.65388: variable 'ansible_search_path' from source: unknown 13830 1727204119.65438: calling self._execute() 13830 1727204119.65540: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.65552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.65569: variable 'omit' from source: magic vars 13830 1727204119.65960: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.65987: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204119.66162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204119.69004: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204119.69087: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204119.69140: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204119.69180: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204119.69212: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204119.69301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.69331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.69371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.69417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.69435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.69535: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.69562: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13830 1727204119.69575: when evaluation is False, skipping this task 13830 1727204119.69583: _execute() done 13830 1727204119.69591: dumping result to json 13830 1727204119.69599: done dumping result, returning 13830 1727204119.69612: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000a34] 13830 
1727204119.69623: sending task result for task 0affcd87-79f5-1659-6b02-000000000a34 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13830 1727204119.69794: no more pending results, returning what we have 13830 1727204119.69798: results queue empty 13830 1727204119.69799: checking for any_errors_fatal 13830 1727204119.69807: done checking for any_errors_fatal 13830 1727204119.69808: checking for max_fail_percentage 13830 1727204119.69811: done checking for max_fail_percentage 13830 1727204119.69812: checking to see if all hosts have failed and the running result is not ok 13830 1727204119.69813: done checking to see if all hosts have failed 13830 1727204119.69814: getting the remaining hosts for this loop 13830 1727204119.69816: done getting the remaining hosts for this loop 13830 1727204119.69820: getting the next task for host managed-node3 13830 1727204119.69829: done getting next task for host managed-node3 13830 1727204119.69833: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13830 1727204119.69839: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204119.69865: getting variables 13830 1727204119.69868: in VariableManager get_vars() 13830 1727204119.69916: Calling all_inventory to load vars for managed-node3 13830 1727204119.69919: Calling groups_inventory to load vars for managed-node3 13830 1727204119.69922: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204119.69933: Calling all_plugins_play to load vars for managed-node3 13830 1727204119.69936: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204119.69939: Calling groups_plugins_play to load vars for managed-node3 13830 1727204119.70918: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a34 13830 1727204119.70921: WORKER PROCESS EXITING 13830 1727204119.72769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204119.76553: done with get_vars() 13830 1727204119.76593: done getting variables 13830 1727204119.77424: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.129) 0:00:52.852 ***** 13830 1727204119.77468: entering _queue_task() for managed-node3/fail 13830 1727204119.77984: worker is 1 (out of 1 available) 13830 1727204119.77997: exiting _queue_task() for managed-node3/fail 13830 1727204119.78009: done queuing things up, now waiting for results queue to drain 13830 1727204119.78011: waiting for pending results... 
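
Annotation: the consent task queued here is implemented with the ansible.builtin.fail action (loaded just above), so when wireless or team connections are present it deliberately stops the play until the operator acknowledges that NetworkManager will be restarted. A hedged sketch of that shape follows, built only from the 'fail' action and the false_condition reported in the skip result below; the message wording is invented for illustration and is not the role's actual text.

    # Sketch only -- the real task lives at roles/network/tasks/main.yml:60.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Applying wireless or team connections requires restarting NetworkManager.
          Re-run the role once a NetworkManager restart is acceptable on this host.
      when: __network_wireless_connections_defined or __network_team_connections_defined

On this run both flags are false (no wireless or team profiles in network_connections), so the task skips instead of failing.
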
13830 1727204119.78325: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13830 1727204119.78495: in run() - task 0affcd87-79f5-1659-6b02-000000000a35 13830 1727204119.78509: variable 'ansible_search_path' from source: unknown 13830 1727204119.78512: variable 'ansible_search_path' from source: unknown 13830 1727204119.78557: calling self._execute() 13830 1727204119.78662: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.78667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.78685: variable 'omit' from source: magic vars 13830 1727204119.79099: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.79114: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204119.79254: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204119.79467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204119.82029: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204119.82107: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204119.82145: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204119.82188: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204119.82214: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204119.82300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.82347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.82379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.82424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.82441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.82496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.82521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.82548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.82595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.82614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.82658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.82683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.82716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.82759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.82776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.82983: variable 'network_connections' from source: task vars 13830 1727204119.82994: variable 'controller_profile' from source: play vars 13830 1727204119.83073: variable 'controller_profile' from source: play vars 13830 1727204119.83083: variable 'controller_device' from source: play vars 13830 1727204119.83155: variable 'controller_device' from source: play vars 13830 1727204119.83165: variable 'dhcp_interface1' from source: play vars 13830 1727204119.83228: variable 'dhcp_interface1' from source: play vars 13830 1727204119.83243: variable 'port1_profile' from source: play vars 13830 1727204119.83305: variable 'port1_profile' from source: play vars 13830 1727204119.83311: variable 'dhcp_interface1' from source: play vars 13830 1727204119.83383: variable 'dhcp_interface1' from source: play vars 13830 1727204119.83389: variable 'controller_profile' from source: play vars 13830 1727204119.83453: variable 'controller_profile' from source: play vars 13830 1727204119.83465: variable 'port2_profile' from source: play vars 13830 1727204119.83525: variable 'port2_profile' from source: play vars 13830 1727204119.83531: variable 'dhcp_interface2' from source: play vars 13830 1727204119.83600: variable 'dhcp_interface2' from source: play vars 13830 1727204119.83607: variable 'controller_profile' from source: play vars 13830 1727204119.83668: variable 'controller_profile' from source: play vars 13830 1727204119.83743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204119.83928: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204119.83969: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204119.83999: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204119.84035: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204119.84082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204119.84103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204119.84135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.84163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204119.84225: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204119.84501: variable 'network_connections' from source: task vars 13830 1727204119.84504: variable 'controller_profile' from source: play vars 13830 1727204119.84576: variable 'controller_profile' from source: play vars 13830 1727204119.84583: variable 'controller_device' from source: play vars 13830 1727204119.84635: variable 'controller_device' from source: play vars 13830 1727204119.84646: variable 'dhcp_interface1' from source: play vars 13830 1727204119.84712: variable 'dhcp_interface1' from source: play vars 13830 1727204119.84722: variable 'port1_profile' from source: play vars 13830 1727204119.84790: variable 'port1_profile' from source: play vars 13830 1727204119.84796: variable 'dhcp_interface1' from source: play vars 13830 1727204119.84856: variable 'dhcp_interface1' from source: play vars 13830 1727204119.84863: variable 'controller_profile' from source: play vars 13830 1727204119.84929: variable 'controller_profile' from source: play vars 13830 1727204119.84936: variable 'port2_profile' from source: play vars 13830 1727204119.85002: variable 'port2_profile' from source: play vars 13830 1727204119.85008: variable 'dhcp_interface2' from source: play vars 13830 1727204119.85068: variable 'dhcp_interface2' from source: play vars 13830 1727204119.85075: variable 'controller_profile' from source: play vars 13830 1727204119.85143: variable 'controller_profile' from source: play vars 13830 1727204119.85178: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204119.85181: when evaluation is False, skipping this task 13830 1727204119.85183: _execute() done 13830 1727204119.85186: dumping result to json 13830 1727204119.85188: done dumping result, returning 13830 1727204119.85202: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000a35] 13830 1727204119.85212: sending task result for task 0affcd87-79f5-1659-6b02-000000000a35 13830 1727204119.85316: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a35 13830 1727204119.85319: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 13830 1727204119.85379: no more pending results, returning what we have 13830 1727204119.85383: results queue empty 13830 1727204119.85384: checking for any_errors_fatal 13830 1727204119.85392: done checking for any_errors_fatal 13830 1727204119.85393: checking for max_fail_percentage 13830 1727204119.85395: done checking for max_fail_percentage 13830 1727204119.85396: checking to see if all hosts have failed and the running result is not ok 13830 1727204119.85397: done checking to see if all hosts have failed 13830 1727204119.85398: getting the remaining hosts for this loop 13830 1727204119.85400: done getting the remaining hosts for this loop 13830 1727204119.85404: getting the next task for host managed-node3 13830 1727204119.85413: done getting next task for host managed-node3 13830 1727204119.85417: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13830 1727204119.85424: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204119.85451: getting variables 13830 1727204119.85453: in VariableManager get_vars() 13830 1727204119.85507: Calling all_inventory to load vars for managed-node3 13830 1727204119.85510: Calling groups_inventory to load vars for managed-node3 13830 1727204119.85513: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204119.85524: Calling all_plugins_play to load vars for managed-node3 13830 1727204119.85527: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204119.85531: Calling groups_plugins_play to load vars for managed-node3 13830 1727204119.88171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204119.90313: done with get_vars() 13830 1727204119.90348: done getting variables 13830 1727204119.90417: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:19 -0400 (0:00:00.129) 0:00:52.982 ***** 13830 1727204119.90465: entering _queue_task() for managed-node3/package 13830 1727204119.90831: worker is 1 (out of 1 available) 13830 1727204119.90843: exiting _queue_task() for managed-node3/package 13830 1727204119.90856: done queuing things up, now waiting for results queue to drain 13830 1727204119.90858: waiting for pending results... 13830 1727204119.91176: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 13830 1727204119.91350: in run() - task 0affcd87-79f5-1659-6b02-000000000a36 13830 1727204119.91375: variable 'ansible_search_path' from source: unknown 13830 1727204119.91383: variable 'ansible_search_path' from source: unknown 13830 1727204119.91434: calling self._execute() 13830 1727204119.91546: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204119.91556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204119.91570: variable 'omit' from source: magic vars 13830 1727204119.92104: variable 'ansible_distribution_major_version' from source: facts 13830 1727204119.92123: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204119.92336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204119.92631: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204119.92686: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204119.92730: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204119.92812: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204119.92943: variable 'network_packages' from source: role '' defaults 13830 1727204119.93089: variable '__network_provider_setup' from source: role '' defaults 13830 1727204119.93118: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204119.93379: variable 
'__network_service_name_default_nm' from source: role '' defaults 13830 1727204119.93392: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204119.93453: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204119.93657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204119.96898: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204119.96979: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204119.97022: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204119.97059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204119.97101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204119.97192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.97454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.97493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.97542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.97566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.97622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.97651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.97693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.97738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.97758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.98048: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13830 1727204119.98183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.98210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.98246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.98292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.98310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.98417: variable 'ansible_python' from source: facts 13830 1727204119.98448: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13830 1727204119.98539: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204119.98644: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204119.98792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.98823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.98852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.98906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.98923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.98976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204119.99019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204119.99049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204119.99096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204119.99121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204119.99280: variable 'network_connections' from source: task vars 13830 1727204119.99293: variable 'controller_profile' from source: play vars 13830 1727204119.99418: variable 'controller_profile' from source: play vars 13830 1727204119.99439: variable 'controller_device' from source: play vars 13830 1727204119.99548: variable 'controller_device' from source: play vars 13830 1727204119.99569: variable 'dhcp_interface1' from source: play vars 13830 1727204119.99665: variable 'dhcp_interface1' from source: play vars 13830 1727204119.99684: variable 'port1_profile' from source: play vars 13830 1727204119.99794: variable 'port1_profile' from source: play vars 13830 1727204119.99810: variable 'dhcp_interface1' from source: play vars 13830 1727204119.99962: variable 'dhcp_interface1' from source: play vars 13830 1727204119.99982: variable 'controller_profile' from source: play vars 13830 1727204120.00100: variable 'controller_profile' from source: play vars 13830 1727204120.00115: variable 'port2_profile' from source: play vars 13830 1727204120.00235: variable 'port2_profile' from source: play vars 13830 1727204120.00251: variable 'dhcp_interface2' from source: play vars 13830 1727204120.00371: variable 'dhcp_interface2' from source: play vars 13830 1727204120.00387: variable 'controller_profile' from source: play vars 13830 1727204120.00503: variable 'controller_profile' from source: play vars 13830 1727204120.00605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204120.00646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204120.00690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.00728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204120.00796: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204120.01126: variable 'network_connections' from source: task vars 13830 1727204120.01137: variable 'controller_profile' from source: play vars 13830 1727204120.01254: variable 'controller_profile' from source: play vars 13830 1727204120.01273: variable 'controller_device' from source: play vars 13830 1727204120.01387: variable 'controller_device' from source: play vars 13830 1727204120.01406: variable 'dhcp_interface1' from source: play vars 13830 1727204120.01487: variable 'dhcp_interface1' from source: play vars 13830 1727204120.01509: variable 'port1_profile' from source: play vars 13830 1727204120.01624: variable 'port1_profile' from source: play vars 13830 1727204120.01639: variable 'dhcp_interface1' from source: play vars 13830 1727204120.01755: variable 'dhcp_interface1' from source: play vars 13830 1727204120.01778: variable 'controller_profile' from source: play vars 13830 1727204120.01892: variable 'controller_profile' from source: play vars 13830 1727204120.01907: variable 'port2_profile' from source: play vars 
13830 1727204120.02027: variable 'port2_profile' from source: play vars 13830 1727204120.02043: variable 'dhcp_interface2' from source: play vars 13830 1727204120.02160: variable 'dhcp_interface2' from source: play vars 13830 1727204120.02178: variable 'controller_profile' from source: play vars 13830 1727204120.02292: variable 'controller_profile' from source: play vars 13830 1727204120.02354: variable '__network_packages_default_wireless' from source: role '' defaults 13830 1727204120.02444: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204120.02783: variable 'network_connections' from source: task vars 13830 1727204120.02793: variable 'controller_profile' from source: play vars 13830 1727204120.02870: variable 'controller_profile' from source: play vars 13830 1727204120.02881: variable 'controller_device' from source: play vars 13830 1727204120.02951: variable 'controller_device' from source: play vars 13830 1727204120.02970: variable 'dhcp_interface1' from source: play vars 13830 1727204120.03038: variable 'dhcp_interface1' from source: play vars 13830 1727204120.03049: variable 'port1_profile' from source: play vars 13830 1727204120.03117: variable 'port1_profile' from source: play vars 13830 1727204120.03129: variable 'dhcp_interface1' from source: play vars 13830 1727204120.03201: variable 'dhcp_interface1' from source: play vars 13830 1727204120.03215: variable 'controller_profile' from source: play vars 13830 1727204120.03288: variable 'controller_profile' from source: play vars 13830 1727204120.03300: variable 'port2_profile' from source: play vars 13830 1727204120.03372: variable 'port2_profile' from source: play vars 13830 1727204120.03386: variable 'dhcp_interface2' from source: play vars 13830 1727204120.03454: variable 'dhcp_interface2' from source: play vars 13830 1727204120.03472: variable 'controller_profile' from source: play vars 13830 1727204120.03542: variable 'controller_profile' from source: play vars 13830 1727204120.03580: variable '__network_packages_default_team' from source: role '' defaults 13830 1727204120.03668: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204120.04024: variable 'network_connections' from source: task vars 13830 1727204120.04035: variable 'controller_profile' from source: play vars 13830 1727204120.04112: variable 'controller_profile' from source: play vars 13830 1727204120.04127: variable 'controller_device' from source: play vars 13830 1727204120.04197: variable 'controller_device' from source: play vars 13830 1727204120.04211: variable 'dhcp_interface1' from source: play vars 13830 1727204120.04291: variable 'dhcp_interface1' from source: play vars 13830 1727204120.04305: variable 'port1_profile' from source: play vars 13830 1727204120.04379: variable 'port1_profile' from source: play vars 13830 1727204120.04392: variable 'dhcp_interface1' from source: play vars 13830 1727204120.04463: variable 'dhcp_interface1' from source: play vars 13830 1727204120.04481: variable 'controller_profile' from source: play vars 13830 1727204120.04548: variable 'controller_profile' from source: play vars 13830 1727204120.04563: variable 'port2_profile' from source: play vars 13830 1727204120.04633: variable 'port2_profile' from source: play vars 13830 1727204120.04645: variable 'dhcp_interface2' from source: play vars 13830 1727204120.04721: variable 'dhcp_interface2' from source: play vars 13830 1727204120.04732: variable 'controller_profile' from source: play vars 13830 
1727204120.04803: variable 'controller_profile' from source: play vars 13830 1727204120.04883: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204120.04950: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204120.04970: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204120.05034: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204120.05246: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13830 1727204120.05814: variable 'network_connections' from source: task vars 13830 1727204120.05824: variable 'controller_profile' from source: play vars 13830 1727204120.05895: variable 'controller_profile' from source: play vars 13830 1727204120.05910: variable 'controller_device' from source: play vars 13830 1727204120.05978: variable 'controller_device' from source: play vars 13830 1727204120.05990: variable 'dhcp_interface1' from source: play vars 13830 1727204120.06056: variable 'dhcp_interface1' from source: play vars 13830 1727204120.06075: variable 'port1_profile' from source: play vars 13830 1727204120.06142: variable 'port1_profile' from source: play vars 13830 1727204120.06154: variable 'dhcp_interface1' from source: play vars 13830 1727204120.06220: variable 'dhcp_interface1' from source: play vars 13830 1727204120.06235: variable 'controller_profile' from source: play vars 13830 1727204120.06293: variable 'controller_profile' from source: play vars 13830 1727204120.06304: variable 'port2_profile' from source: play vars 13830 1727204120.06363: variable 'port2_profile' from source: play vars 13830 1727204120.06379: variable 'dhcp_interface2' from source: play vars 13830 1727204120.06454: variable 'dhcp_interface2' from source: play vars 13830 1727204120.06468: variable 'controller_profile' from source: play vars 13830 1727204120.06538: variable 'controller_profile' from source: play vars 13830 1727204120.06557: variable 'ansible_distribution' from source: facts 13830 1727204120.06573: variable '__network_rh_distros' from source: role '' defaults 13830 1727204120.06584: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.06621: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13830 1727204120.06807: variable 'ansible_distribution' from source: facts 13830 1727204120.06819: variable '__network_rh_distros' from source: role '' defaults 13830 1727204120.06830: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.06852: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13830 1727204120.07035: variable 'ansible_distribution' from source: facts 13830 1727204120.07045: variable '__network_rh_distros' from source: role '' defaults 13830 1727204120.07060: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.07109: variable 'network_provider' from source: set_fact 13830 1727204120.07131: variable 'ansible_facts' from source: unknown 13830 1727204120.07964: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13830 1727204120.07978: when evaluation is False, skipping this task 13830 1727204120.07984: _execute() done 13830 1727204120.07992: dumping result to json 13830 1727204120.07998: done dumping result, returning 13830 1727204120.08011: done running TaskExecutor() for 
managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1659-6b02-000000000a36] 13830 1727204120.08023: sending task result for task 0affcd87-79f5-1659-6b02-000000000a36 skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13830 1727204120.08199: no more pending results, returning what we have 13830 1727204120.08204: results queue empty 13830 1727204120.08205: checking for any_errors_fatal 13830 1727204120.08214: done checking for any_errors_fatal 13830 1727204120.08215: checking for max_fail_percentage 13830 1727204120.08217: done checking for max_fail_percentage 13830 1727204120.08218: checking to see if all hosts have failed and the running result is not ok 13830 1727204120.08219: done checking to see if all hosts have failed 13830 1727204120.08219: getting the remaining hosts for this loop 13830 1727204120.08221: done getting the remaining hosts for this loop 13830 1727204120.08226: getting the next task for host managed-node3 13830 1727204120.08235: done getting next task for host managed-node3 13830 1727204120.08239: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13830 1727204120.08244: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204120.08271: getting variables 13830 1727204120.08273: in VariableManager get_vars() 13830 1727204120.08321: Calling all_inventory to load vars for managed-node3 13830 1727204120.08324: Calling groups_inventory to load vars for managed-node3 13830 1727204120.08327: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204120.08338: Calling all_plugins_play to load vars for managed-node3 13830 1727204120.08341: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204120.08344: Calling groups_plugins_play to load vars for managed-node3 13830 1727204120.09325: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a36 13830 1727204120.09329: WORKER PROCESS EXITING 13830 1727204120.10159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204120.11907: done with get_vars() 13830 1727204120.11946: done getting variables 13830 1727204120.12012: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.215) 0:00:53.198 ***** 13830 1727204120.12061: entering _queue_task() for managed-node3/package 13830 1727204120.12411: worker is 1 (out of 1 available) 13830 1727204120.12424: exiting _queue_task() for managed-node3/package 13830 1727204120.12436: done queuing things up, now waiting for results queue to drain 13830 1727204120.12438: waiting for pending results... 
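
Annotation: stepping back from the worker bookkeeping, the "Install packages" task above skipped because its guard, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False, i.e. every package the selected provider needs is already present in the gathered package facts. A minimal sketch of that guard pattern, using the condition and the 'package' action shown in the trace; the name/state arguments are assumed rather than copied from the role.

    # Sketch only -- illustrates the subset() guard behind the Install packages skip.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

The subset test is an ansible.builtin test; guarding on it avoids a needless package-manager transaction (and a spurious "changed" result) when nothing is missing.
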
13830 1727204120.12745: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13830 1727204120.12938: in run() - task 0affcd87-79f5-1659-6b02-000000000a37 13830 1727204120.12960: variable 'ansible_search_path' from source: unknown 13830 1727204120.12973: variable 'ansible_search_path' from source: unknown 13830 1727204120.13020: calling self._execute() 13830 1727204120.13133: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204120.13150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204120.13167: variable 'omit' from source: magic vars 13830 1727204120.13568: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.13591: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204120.13725: variable 'network_state' from source: role '' defaults 13830 1727204120.13741: Evaluated conditional (network_state != {}): False 13830 1727204120.13755: when evaluation is False, skipping this task 13830 1727204120.13762: _execute() done 13830 1727204120.13776: dumping result to json 13830 1727204120.13783: done dumping result, returning 13830 1727204120.13795: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1659-6b02-000000000a37] 13830 1727204120.13808: sending task result for task 0affcd87-79f5-1659-6b02-000000000a37 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204120.13975: no more pending results, returning what we have 13830 1727204120.13981: results queue empty 13830 1727204120.13982: checking for any_errors_fatal 13830 1727204120.13989: done checking for any_errors_fatal 13830 1727204120.13990: checking for max_fail_percentage 13830 1727204120.13992: done checking for max_fail_percentage 13830 1727204120.13993: checking to see if all hosts have failed and the running result is not ok 13830 1727204120.13994: done checking to see if all hosts have failed 13830 1727204120.13994: getting the remaining hosts for this loop 13830 1727204120.13996: done getting the remaining hosts for this loop 13830 1727204120.14001: getting the next task for host managed-node3 13830 1727204120.14010: done getting next task for host managed-node3 13830 1727204120.14015: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13830 1727204120.14020: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204120.14047: getting variables 13830 1727204120.14049: in VariableManager get_vars() 13830 1727204120.14104: Calling all_inventory to load vars for managed-node3 13830 1727204120.14107: Calling groups_inventory to load vars for managed-node3 13830 1727204120.14110: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204120.14122: Calling all_plugins_play to load vars for managed-node3 13830 1727204120.14125: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204120.14128: Calling groups_plugins_play to load vars for managed-node3 13830 1727204120.15124: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a37 13830 1727204120.15127: WORKER PROCESS EXITING 13830 1727204120.16176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204120.17918: done with get_vars() 13830 1727204120.17958: done getting variables 13830 1727204120.18038: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.060) 0:00:53.258 ***** 13830 1727204120.18080: entering _queue_task() for managed-node3/package 13830 1727204120.18436: worker is 1 (out of 1 available) 13830 1727204120.18448: exiting _queue_task() for managed-node3/package 13830 1727204120.18460: done queuing things up, now waiting for results queue to drain 13830 1727204120.18461: waiting for pending results... 
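
Annotation: both nmstate-related install tasks (main.yml:85 above and main.yml:96 just queued) hinge on the same guard, network_state != {}. The trace shows network_state coming from the role defaults and the condition evaluating to False, which is consistent with a default of an empty dict; the defaults excerpt below is an assumption drawn from that behaviour, not a copy of the role's defaults/main.yml, and the module arguments are likewise assumed.

    # Hypothetical defaults excerpt consistent with this trace (defaults/main.yml):
    #   network_state: {}
    #
    # Sketch of the guarded install:
    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when: network_state != {}

Since this play drives everything through network_connections rather than network_state, both tasks skip.
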
13830 1727204120.19579: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13830 1727204120.19922: in run() - task 0affcd87-79f5-1659-6b02-000000000a38 13830 1727204120.19985: variable 'ansible_search_path' from source: unknown 13830 1727204120.20076: variable 'ansible_search_path' from source: unknown 13830 1727204120.20115: calling self._execute() 13830 1727204120.20327: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204120.20338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204120.20353: variable 'omit' from source: magic vars 13830 1727204120.21101: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.21177: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204120.21414: variable 'network_state' from source: role '' defaults 13830 1727204120.21494: Evaluated conditional (network_state != {}): False 13830 1727204120.21502: when evaluation is False, skipping this task 13830 1727204120.21509: _execute() done 13830 1727204120.21516: dumping result to json 13830 1727204120.21523: done dumping result, returning 13830 1727204120.21537: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1659-6b02-000000000a38] 13830 1727204120.21549: sending task result for task 0affcd87-79f5-1659-6b02-000000000a38 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204120.21747: no more pending results, returning what we have 13830 1727204120.21752: results queue empty 13830 1727204120.21753: checking for any_errors_fatal 13830 1727204120.21766: done checking for any_errors_fatal 13830 1727204120.21767: checking for max_fail_percentage 13830 1727204120.21769: done checking for max_fail_percentage 13830 1727204120.21770: checking to see if all hosts have failed and the running result is not ok 13830 1727204120.21771: done checking to see if all hosts have failed 13830 1727204120.21772: getting the remaining hosts for this loop 13830 1727204120.21773: done getting the remaining hosts for this loop 13830 1727204120.21778: getting the next task for host managed-node3 13830 1727204120.21788: done getting next task for host managed-node3 13830 1727204120.21793: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13830 1727204120.21799: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204120.21823: getting variables 13830 1727204120.21826: in VariableManager get_vars() 13830 1727204120.21875: Calling all_inventory to load vars for managed-node3 13830 1727204120.21878: Calling groups_inventory to load vars for managed-node3 13830 1727204120.21881: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204120.21893: Calling all_plugins_play to load vars for managed-node3 13830 1727204120.21896: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204120.21899: Calling groups_plugins_play to load vars for managed-node3 13830 1727204120.23773: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a38 13830 1727204120.23777: WORKER PROCESS EXITING 13830 1727204120.24539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204120.28525: done with get_vars() 13830 1727204120.28563: done getting variables 13830 1727204120.28628: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.105) 0:00:53.364 ***** 13830 1727204120.28671: entering _queue_task() for managed-node3/service 13830 1727204120.29420: worker is 1 (out of 1 available) 13830 1727204120.29433: exiting _queue_task() for managed-node3/service 13830 1727204120.29446: done queuing things up, now waiting for results queue to drain 13830 1727204120.29448: waiting for pending results... 
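Each TASK header in this run is followed by its file location (task path: .../tasks/main.yml:109) and a timing line such as "(0:00:00.105) 0:00:53.364", i.e. time spent on the previous task and cumulative play time, apparently emitted by a profiling callback. A small, hedged sketch for pulling those numbers out of a saved log, which can help spot slow tasks in output this verbose:

# Rough sketch, not an official Ansible tool: extract per-task and cumulative timing
# from header lines like
# "Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.105) 0:00:53.364 *****".
import re

TIMING = re.compile(r"\((?P<delta>\d+:\d{2}:\d{2}\.\d+)\)\s+(?P<total>\d+:\d{2}:\d{2}\.\d+)")

def to_seconds(hms: str) -> float:
    hours, minutes, seconds = hms.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

line = "Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.105) 0:00:53.364 *****"
match = TIMING.search(line)
if match:
    print(to_seconds(match.group("delta")), to_seconds(match.group("total")))  # 0.105 53.364

Whether a callback plugin produces these lines is an assumption; the regex relies only on the format visible in this log.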
13830 1727204120.30060: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13830 1727204120.30412: in run() - task 0affcd87-79f5-1659-6b02-000000000a39 13830 1727204120.30509: variable 'ansible_search_path' from source: unknown 13830 1727204120.30514: variable 'ansible_search_path' from source: unknown 13830 1727204120.30660: calling self._execute() 13830 1727204120.30876: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204120.30880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204120.30893: variable 'omit' from source: magic vars 13830 1727204120.31656: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.31672: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204120.32043: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204120.32589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204120.38309: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204120.38379: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204120.38421: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204120.38457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204120.38486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204120.38585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204120.38624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204120.38651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.38692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204120.38725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204120.38777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204120.38814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204120.38916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13830 1727204120.39604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204120.39618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204120.39661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204120.39688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204120.39711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.39754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204120.39768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204120.40251: variable 'network_connections' from source: task vars 13830 1727204120.40265: variable 'controller_profile' from source: play vars 13830 1727204120.40410: variable 'controller_profile' from source: play vars 13830 1727204120.40420: variable 'controller_device' from source: play vars 13830 1727204120.40486: variable 'controller_device' from source: play vars 13830 1727204120.40495: variable 'dhcp_interface1' from source: play vars 13830 1727204120.40557: variable 'dhcp_interface1' from source: play vars 13830 1727204120.40566: variable 'port1_profile' from source: play vars 13830 1727204120.40628: variable 'port1_profile' from source: play vars 13830 1727204120.40637: variable 'dhcp_interface1' from source: play vars 13830 1727204120.40700: variable 'dhcp_interface1' from source: play vars 13830 1727204120.40706: variable 'controller_profile' from source: play vars 13830 1727204120.40766: variable 'controller_profile' from source: play vars 13830 1727204120.40773: variable 'port2_profile' from source: play vars 13830 1727204120.40840: variable 'port2_profile' from source: play vars 13830 1727204120.40846: variable 'dhcp_interface2' from source: play vars 13830 1727204120.40909: variable 'dhcp_interface2' from source: play vars 13830 1727204120.40915: variable 'controller_profile' from source: play vars 13830 1727204120.40997: variable 'controller_profile' from source: play vars 13830 1727204120.41085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204120.41269: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204120.41310: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204120.41345: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 
1727204120.41377: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204120.41423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204120.41457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204120.41484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.41510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204120.41588: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204120.41834: variable 'network_connections' from source: task vars 13830 1727204120.41843: variable 'controller_profile' from source: play vars 13830 1727204120.41909: variable 'controller_profile' from source: play vars 13830 1727204120.41916: variable 'controller_device' from source: play vars 13830 1727204120.41977: variable 'controller_device' from source: play vars 13830 1727204120.41984: variable 'dhcp_interface1' from source: play vars 13830 1727204120.42041: variable 'dhcp_interface1' from source: play vars 13830 1727204120.42048: variable 'port1_profile' from source: play vars 13830 1727204120.42108: variable 'port1_profile' from source: play vars 13830 1727204120.42115: variable 'dhcp_interface1' from source: play vars 13830 1727204120.42178: variable 'dhcp_interface1' from source: play vars 13830 1727204120.42184: variable 'controller_profile' from source: play vars 13830 1727204120.42242: variable 'controller_profile' from source: play vars 13830 1727204120.42248: variable 'port2_profile' from source: play vars 13830 1727204120.42307: variable 'port2_profile' from source: play vars 13830 1727204120.42318: variable 'dhcp_interface2' from source: play vars 13830 1727204120.42382: variable 'dhcp_interface2' from source: play vars 13830 1727204120.42388: variable 'controller_profile' from source: play vars 13830 1727204120.42452: variable 'controller_profile' from source: play vars 13830 1727204120.42491: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204120.42494: when evaluation is False, skipping this task 13830 1727204120.42497: _execute() done 13830 1727204120.42500: dumping result to json 13830 1727204120.42502: done dumping result, returning 13830 1727204120.42512: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000a39] 13830 1727204120.42514: sending task result for task 0affcd87-79f5-1659-6b02-000000000a39 13830 1727204120.42628: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a39 13830 1727204120.42630: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204120.42679: no more pending results, returning what we have 
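The restart task is skipped as well: none of the configured connections (the controller profile and its two port profiles referenced in the play vars) is a wireless or team interface, so __network_wireless_connections_defined or __network_team_connections_defined is False. When skimming -vvvv output of this size, it can help to collect every skip result and its false_condition in one pass. A minimal sketch, assuming the flat "skipping: [host] => { ... }" JSON fragments seen in this log:

# Hedged sketch: list (host, false_condition, skip_reason) for every skipped task
# found in raw ansible-playbook -vvvv output.
import json
import re

SKIP = re.compile(
    r'skipping: \[(?P<host>[^\]]+)\] => '
    r'(?P<body>\{.*?"skip_reason": "[^"]*"\s*\})',
    re.DOTALL,
)

def skipped_results(log_text: str):
    for match in SKIP.finditer(log_text):
        result = json.loads(match.group("body"))
        yield match.group("host"), result.get("false_condition"), result.get("skip_reason")

sample = ('skipping: [managed-node3] => { "changed": false, '
          '"false_condition": "network_state != {}", '
          '"skip_reason": "Conditional result was False" }')
print(list(skipped_results(sample)))
# [('managed-node3', 'network_state != {}', 'Conditional result was False')]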
13830 1727204120.42682: results queue empty 13830 1727204120.42683: checking for any_errors_fatal 13830 1727204120.42689: done checking for any_errors_fatal 13830 1727204120.42690: checking for max_fail_percentage 13830 1727204120.42692: done checking for max_fail_percentage 13830 1727204120.42692: checking to see if all hosts have failed and the running result is not ok 13830 1727204120.42693: done checking to see if all hosts have failed 13830 1727204120.42694: getting the remaining hosts for this loop 13830 1727204120.42695: done getting the remaining hosts for this loop 13830 1727204120.42699: getting the next task for host managed-node3 13830 1727204120.42707: done getting next task for host managed-node3 13830 1727204120.42711: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13830 1727204120.42716: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204120.42737: getting variables 13830 1727204120.42739: in VariableManager get_vars() 13830 1727204120.42789: Calling all_inventory to load vars for managed-node3 13830 1727204120.42793: Calling groups_inventory to load vars for managed-node3 13830 1727204120.42796: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204120.42806: Calling all_plugins_play to load vars for managed-node3 13830 1727204120.42809: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204120.42813: Calling groups_plugins_play to load vars for managed-node3 13830 1727204120.45744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204120.47945: done with get_vars() 13830 1727204120.47990: done getting variables 13830 1727204120.49177: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:20 -0400 (0:00:00.206) 0:00:53.570 ***** 13830 1727204120.49287: entering _queue_task() for managed-node3/service 13830 1727204120.49776: worker is 1 (out of 1 available) 13830 1727204120.49790: exiting _queue_task() for managed-node3/service 13830 1727204120.49803: done queuing things up, now waiting for results queue to drain 13830 1727204120.49805: waiting for pending results... 13830 1727204120.50141: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13830 1727204120.50314: in run() - task 0affcd87-79f5-1659-6b02-000000000a3a 13830 1727204120.50326: variable 'ansible_search_path' from source: unknown 13830 1727204120.50329: variable 'ansible_search_path' from source: unknown 13830 1727204120.50377: calling self._execute() 13830 1727204120.50484: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204120.50491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204120.50503: variable 'omit' from source: magic vars 13830 1727204120.50907: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.50924: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204120.51096: variable 'network_provider' from source: set_fact 13830 1727204120.51100: variable 'network_state' from source: role '' defaults 13830 1727204120.51111: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13830 1727204120.51117: variable 'omit' from source: magic vars 13830 1727204120.51199: variable 'omit' from source: magic vars 13830 1727204120.51228: variable 'network_service_name' from source: role '' defaults 13830 1727204120.51304: variable 'network_service_name' from source: role '' defaults 13830 1727204120.51469: variable '__network_provider_setup' from source: role '' defaults 13830 1727204120.51475: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204120.52018: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204120.52028: variable '__network_packages_default_nm' from source: role '' defaults 
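This time the conditional passes (network_provider == "nm" or network_state != {} is True because the provider resolved to "nm" via set_fact), so the role goes on to resolve network_service_name and the provider-specific defaults, opens the SSH connection, and ships its systemd module to the host (the AnsiballZ_systemd.py transfer below). As a hedged sketch of the end state that task is after, expressed with plain systemctl calls rather than Ansible's systemd module, and assuming sufficient privileges on the managed host:

# Illustration only: Ansible performs this via its systemd module pushed over SSH,
# not by shelling out to systemctl like this.
import subprocess

def unit_state(unit: str) -> tuple[str, str]:
    """Return (enabled_state, active_state) for a systemd unit."""
    enabled = subprocess.run(["systemctl", "is-enabled", unit],
                             capture_output=True, text=True).stdout.strip()
    active = subprocess.run(["systemctl", "is-active", unit],
                            capture_output=True, text=True).stdout.strip()
    return enabled, active

def ensure_enabled_and_started(unit: str = "NetworkManager.service") -> None:
    enabled, active = unit_state(unit)
    if enabled != "enabled":
        subprocess.run(["systemctl", "enable", unit], check=True)
    if active != "active":
        subprocess.run(["systemctl", "start", unit], check=True)

The module's result, visible in the stdout chunk further below, is the unit's systemd property dump (Type, MainPID, ExecStart, and so on) returned as JSON.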
13830 1727204120.52101: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204120.52332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204120.55891: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204120.55983: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204120.56019: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204120.56075: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204120.56103: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204120.56200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204120.56228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204120.56262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.56305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204120.56319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204120.56375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204120.56400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204120.56425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.56473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204120.56503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204120.56762: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13830 1727204120.56897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204120.56925: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204120.56953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.56995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204120.57009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204120.57116: variable 'ansible_python' from source: facts 13830 1727204120.57140: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13830 1727204120.57226: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204120.57315: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204120.57454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204120.57481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204120.57644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.58553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204120.58577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204120.58626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204120.58657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204120.58683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.58724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204120.58739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204120.59069: variable 'network_connections' from 
source: task vars 13830 1727204120.59154: variable 'controller_profile' from source: play vars 13830 1727204120.59171: variable 'controller_profile' from source: play vars 13830 1727204120.59184: variable 'controller_device' from source: play vars 13830 1727204120.59258: variable 'controller_device' from source: play vars 13830 1727204120.59276: variable 'dhcp_interface1' from source: play vars 13830 1727204120.59342: variable 'dhcp_interface1' from source: play vars 13830 1727204120.59356: variable 'port1_profile' from source: play vars 13830 1727204120.59463: variable 'port1_profile' from source: play vars 13830 1727204120.59481: variable 'dhcp_interface1' from source: play vars 13830 1727204120.59553: variable 'dhcp_interface1' from source: play vars 13830 1727204120.59563: variable 'controller_profile' from source: play vars 13830 1727204120.59643: variable 'controller_profile' from source: play vars 13830 1727204120.59654: variable 'port2_profile' from source: play vars 13830 1727204120.59734: variable 'port2_profile' from source: play vars 13830 1727204120.59748: variable 'dhcp_interface2' from source: play vars 13830 1727204120.59826: variable 'dhcp_interface2' from source: play vars 13830 1727204120.59839: variable 'controller_profile' from source: play vars 13830 1727204120.59913: variable 'controller_profile' from source: play vars 13830 1727204120.60026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204120.60223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204120.60282: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204120.60339: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204120.60386: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204120.60450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204120.60487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204120.60520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204120.60557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204120.60609: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204120.60980: variable 'network_connections' from source: task vars 13830 1727204120.60986: variable 'controller_profile' from source: play vars 13830 1727204120.61218: variable 'controller_profile' from source: play vars 13830 1727204120.61252: variable 'controller_device' from source: play vars 13830 1727204120.61339: variable 'controller_device' from source: play vars 13830 1727204120.61377: variable 'dhcp_interface1' from source: play vars 13830 1727204120.61481: variable 'dhcp_interface1' from source: play vars 13830 1727204120.61500: variable 'port1_profile' from source: 
play vars 13830 1727204120.61665: variable 'port1_profile' from source: play vars 13830 1727204120.61709: variable 'dhcp_interface1' from source: play vars 13830 1727204120.61782: variable 'dhcp_interface1' from source: play vars 13830 1727204120.61793: variable 'controller_profile' from source: play vars 13830 1727204120.61985: variable 'controller_profile' from source: play vars 13830 1727204120.61996: variable 'port2_profile' from source: play vars 13830 1727204120.62259: variable 'port2_profile' from source: play vars 13830 1727204120.62273: variable 'dhcp_interface2' from source: play vars 13830 1727204120.62345: variable 'dhcp_interface2' from source: play vars 13830 1727204120.62475: variable 'controller_profile' from source: play vars 13830 1727204120.62606: variable 'controller_profile' from source: play vars 13830 1727204120.62971: variable '__network_packages_default_wireless' from source: role '' defaults 13830 1727204120.62975: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204120.63429: variable 'network_connections' from source: task vars 13830 1727204120.63433: variable 'controller_profile' from source: play vars 13830 1727204120.63540: variable 'controller_profile' from source: play vars 13830 1727204120.63548: variable 'controller_device' from source: play vars 13830 1727204120.63839: variable 'controller_device' from source: play vars 13830 1727204120.63847: variable 'dhcp_interface1' from source: play vars 13830 1727204120.63926: variable 'dhcp_interface1' from source: play vars 13830 1727204120.64283: variable 'port1_profile' from source: play vars 13830 1727204120.64358: variable 'port1_profile' from source: play vars 13830 1727204120.64365: variable 'dhcp_interface1' from source: play vars 13830 1727204120.64437: variable 'dhcp_interface1' from source: play vars 13830 1727204120.64445: variable 'controller_profile' from source: play vars 13830 1727204120.64518: variable 'controller_profile' from source: play vars 13830 1727204120.64524: variable 'port2_profile' from source: play vars 13830 1727204120.64592: variable 'port2_profile' from source: play vars 13830 1727204120.64599: variable 'dhcp_interface2' from source: play vars 13830 1727204120.64669: variable 'dhcp_interface2' from source: play vars 13830 1727204120.64676: variable 'controller_profile' from source: play vars 13830 1727204120.64748: variable 'controller_profile' from source: play vars 13830 1727204120.64777: variable '__network_packages_default_team' from source: role '' defaults 13830 1727204120.64856: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204120.65159: variable 'network_connections' from source: task vars 13830 1727204120.65765: variable 'controller_profile' from source: play vars 13830 1727204120.65847: variable 'controller_profile' from source: play vars 13830 1727204120.65853: variable 'controller_device' from source: play vars 13830 1727204120.65938: variable 'controller_device' from source: play vars 13830 1727204120.65946: variable 'dhcp_interface1' from source: play vars 13830 1727204120.66036: variable 'dhcp_interface1' from source: play vars 13830 1727204120.66047: variable 'port1_profile' from source: play vars 13830 1727204120.66117: variable 'port1_profile' from source: play vars 13830 1727204120.66123: variable 'dhcp_interface1' from source: play vars 13830 1727204120.66196: variable 'dhcp_interface1' from source: play vars 13830 1727204120.66204: variable 'controller_profile' from source: play 
vars 13830 1727204120.66275: variable 'controller_profile' from source: play vars 13830 1727204120.66283: variable 'port2_profile' from source: play vars 13830 1727204120.66350: variable 'port2_profile' from source: play vars 13830 1727204120.66357: variable 'dhcp_interface2' from source: play vars 13830 1727204120.66430: variable 'dhcp_interface2' from source: play vars 13830 1727204120.66434: variable 'controller_profile' from source: play vars 13830 1727204120.67047: variable 'controller_profile' from source: play vars 13830 1727204120.67114: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204120.67176: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204120.67182: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204120.67246: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204120.67746: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13830 1727204120.68252: variable 'network_connections' from source: task vars 13830 1727204120.68255: variable 'controller_profile' from source: play vars 13830 1727204120.68406: variable 'controller_profile' from source: play vars 13830 1727204120.68413: variable 'controller_device' from source: play vars 13830 1727204120.68474: variable 'controller_device' from source: play vars 13830 1727204120.68481: variable 'dhcp_interface1' from source: play vars 13830 1727204120.68543: variable 'dhcp_interface1' from source: play vars 13830 1727204120.68551: variable 'port1_profile' from source: play vars 13830 1727204120.68612: variable 'port1_profile' from source: play vars 13830 1727204120.68617: variable 'dhcp_interface1' from source: play vars 13830 1727204120.68681: variable 'dhcp_interface1' from source: play vars 13830 1727204120.68686: variable 'controller_profile' from source: play vars 13830 1727204120.68746: variable 'controller_profile' from source: play vars 13830 1727204120.68752: variable 'port2_profile' from source: play vars 13830 1727204120.68814: variable 'port2_profile' from source: play vars 13830 1727204120.68824: variable 'dhcp_interface2' from source: play vars 13830 1727204120.68884: variable 'dhcp_interface2' from source: play vars 13830 1727204120.68889: variable 'controller_profile' from source: play vars 13830 1727204120.68948: variable 'controller_profile' from source: play vars 13830 1727204120.68955: variable 'ansible_distribution' from source: facts 13830 1727204120.68958: variable '__network_rh_distros' from source: role '' defaults 13830 1727204120.68965: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.68991: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13830 1727204120.69185: variable 'ansible_distribution' from source: facts 13830 1727204120.69188: variable '__network_rh_distros' from source: role '' defaults 13830 1727204120.69193: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.69207: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13830 1727204120.69382: variable 'ansible_distribution' from source: facts 13830 1727204120.69385: variable '__network_rh_distros' from source: role '' defaults 13830 1727204120.69391: variable 'ansible_distribution_major_version' from source: facts 13830 1727204120.69429: variable 'network_provider' from source: set_fact 13830 
1727204120.69456: variable 'omit' from source: magic vars 13830 1727204120.69486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204120.69516: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204120.69538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204120.69555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204120.69567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204120.69596: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204120.69599: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204120.69601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204120.69702: Set connection var ansible_connection to ssh 13830 1727204120.69713: Set connection var ansible_timeout to 10 13830 1727204120.69722: Set connection var ansible_shell_executable to /bin/sh 13830 1727204120.69724: Set connection var ansible_shell_type to sh 13830 1727204120.69729: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204120.69742: Set connection var ansible_pipelining to False 13830 1727204120.69769: variable 'ansible_shell_executable' from source: unknown 13830 1727204120.69772: variable 'ansible_connection' from source: unknown 13830 1727204120.69775: variable 'ansible_module_compression' from source: unknown 13830 1727204120.69777: variable 'ansible_shell_type' from source: unknown 13830 1727204120.69781: variable 'ansible_shell_executable' from source: unknown 13830 1727204120.69783: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204120.69788: variable 'ansible_pipelining' from source: unknown 13830 1727204120.69790: variable 'ansible_timeout' from source: unknown 13830 1727204120.69794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204120.69912: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204120.69922: variable 'omit' from source: magic vars 13830 1727204120.69928: starting attempt loop 13830 1727204120.69931: running the handler 13830 1727204120.70020: variable 'ansible_facts' from source: unknown 13830 1727204120.71003: _low_level_execute_command(): starting 13830 1727204120.71009: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204120.71718: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204120.71728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.71745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.71755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.71797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.71804: stderr chunk (state=3): >>>debug2: match 
not found <<< 13830 1727204120.71813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.71826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204120.71833: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204120.71842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204120.71852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.71859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.71871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.71878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.71885: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204120.71894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.71971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204120.71992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204120.72005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204120.72089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204120.73788: stdout chunk (state=3): >>>/root <<< 13830 1727204120.73966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204120.73970: stdout chunk (state=3): >>><<< 13830 1727204120.73980: stderr chunk (state=3): >>><<< 13830 1727204120.74000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204120.74011: _low_level_execute_command(): starting 13830 1727204120.74019: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687 `" && echo ansible-tmp-1727204120.7400014-17719-150333905500687="` echo /root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687 `" ) && sleep 0' 13830 1727204120.74657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 13830 1727204120.74667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.74682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.74696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.74741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.74749: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204120.74760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.74776: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204120.74784: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204120.74790: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204120.74800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.74806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.74818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.74826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.74832: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204120.74846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.74915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204120.74930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204120.74942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204120.75027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204120.77003: stdout chunk (state=3): >>>ansible-tmp-1727204120.7400014-17719-150333905500687=/root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687 <<< 13830 1727204120.77121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204120.77212: stderr chunk (state=3): >>><<< 13830 1727204120.77215: stdout chunk (state=3): >>><<< 13830 1727204120.77239: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204120.7400014-17719-150333905500687=/root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204120.77275: variable 'ansible_module_compression' from source: unknown 13830 1727204120.77331: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13830 1727204120.77394: variable 'ansible_facts' from source: unknown 13830 1727204120.77582: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687/AnsiballZ_systemd.py 13830 1727204120.77742: Sending initial data 13830 1727204120.77746: Sent initial data (156 bytes) 13830 1727204120.78698: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204120.78706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.78716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.78729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.78774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.78778: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204120.78787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.78799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204120.78806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204120.78813: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204120.78820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.78829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.78844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.78851: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.78860: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204120.78866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.78946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204120.78961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204120.78965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204120.79052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204120.80900: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204120.80941: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204120.80983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp7ahp4fvu /root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687/AnsiballZ_systemd.py <<< 13830 1727204120.81022: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204120.83915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204120.84036: stderr chunk (state=3): >>><<< 13830 1727204120.84039: stdout chunk (state=3): >>><<< 13830 1727204120.84042: done transferring module to remote 13830 1727204120.84044: _low_level_execute_command(): starting 13830 1727204120.84046: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687/ /root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687/AnsiballZ_systemd.py && sleep 0' 13830 1727204120.84620: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204120.84637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.84652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.84674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.84719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.84732: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204120.84747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.84767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204120.84780: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204120.84846: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204120.84859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.84874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.84889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.85516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.85528: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204120.85542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.85624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204120.85641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204120.85656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204120.85742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204120.87693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204120.87697: stdout chunk (state=3): 
>>><<< 13830 1727204120.87699: stderr chunk (state=3): >>><<< 13830 1727204120.87796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204120.87800: _low_level_execute_command(): starting 13830 1727204120.87803: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687/AnsiballZ_systemd.py && sleep 0' 13830 1727204120.89285: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204120.89348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.89456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.89480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.89521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.89532: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204120.89546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.89566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204120.89576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204120.89585: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204120.89594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204120.89605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204120.89618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204120.89627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204120.89645: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204120.89659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204120.89736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204120.89895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204120.89909: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204120.90108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204121.15274: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15974400", "MemoryAvailable": "infinity", "CPUUsageNSec": "991888000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", 
"ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": 
"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13830 1727204121.16829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204121.16833: stdout chunk (state=3): >>><<< 13830 1727204121.16836: stderr chunk (state=3): >>><<< 13830 1727204121.16871: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "15974400", "MemoryAvailable": "infinity", "CPUUsageNSec": "991888000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204121.17073: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204121.17077: _low_level_execute_command(): starting 13830 1727204121.17084: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204120.7400014-17719-150333905500687/ > /dev/null 2>&1 && sleep 0' 13830 1727204121.18807: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204121.18813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204121.18829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204121.18955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 13830 1727204121.18960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204121.19023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204121.19060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204121.19063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204121.19123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204121.20892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204121.20981: stderr chunk (state=3): >>><<< 13830 1727204121.20984: stdout chunk (state=3): >>><<< 13830 1727204121.21075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204121.21078: handler run complete 13830 1727204121.21273: attempt loop complete, returning result 13830 1727204121.21276: _execute() done 13830 1727204121.21278: dumping result to json 13830 1727204121.21279: done dumping result, returning 13830 1727204121.21281: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1659-6b02-000000000a3a] 13830 1727204121.21283: sending task result for task 0affcd87-79f5-1659-6b02-000000000a3a 13830 1727204121.21404: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a3a 13830 1727204121.21407: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204121.21473: no more pending results, returning what we have 13830 1727204121.21477: results queue empty 13830 1727204121.21478: checking for any_errors_fatal 13830 1727204121.21486: done checking for any_errors_fatal 13830 1727204121.21487: checking for max_fail_percentage 13830 1727204121.21488: done checking for max_fail_percentage 13830 1727204121.21489: checking to see if all hosts have failed and the running result is not ok 13830 1727204121.21490: done checking to see if all hosts have failed 13830 1727204121.21491: getting the remaining hosts for this loop 13830 1727204121.21492: done getting the remaining hosts for this loop 13830 1727204121.21496: getting the next task for host managed-node3 13830 1727204121.21504: done getting next task for host managed-node3 13830 1727204121.21508: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13830 1727204121.21514: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204121.21530: getting variables 13830 1727204121.21535: in VariableManager get_vars() 13830 1727204121.21587: Calling all_inventory to load vars for managed-node3 13830 1727204121.21590: Calling groups_inventory to load vars for managed-node3 13830 1727204121.21593: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204121.21604: Calling all_plugins_play to load vars for managed-node3 13830 1727204121.21607: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204121.21610: Calling groups_plugins_play to load vars for managed-node3 13830 1727204121.26196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204121.29437: done with get_vars() 13830 1727204121.29463: done getting variables 13830 1727204121.29525: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.802) 0:00:54.373 ***** 13830 1727204121.29571: entering _queue_task() for managed-node3/service 13830 1727204121.30626: worker is 1 (out of 1 available) 13830 1727204121.30641: exiting _queue_task() for managed-node3/service 13830 1727204121.30654: done queuing things up, now waiting for results queue to drain 13830 1727204121.30656: waiting for pending results... 13830 1727204121.32171: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13830 1727204121.32311: in run() - task 0affcd87-79f5-1659-6b02-000000000a3b 13830 1727204121.32324: variable 'ansible_search_path' from source: unknown 13830 1727204121.32328: variable 'ansible_search_path' from source: unknown 13830 1727204121.32823: calling self._execute() 13830 1727204121.32934: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204121.32939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204121.32949: variable 'omit' from source: magic vars 13830 1727204121.33731: variable 'ansible_distribution_major_version' from source: facts 13830 1727204121.33743: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204121.33978: variable 'network_provider' from source: set_fact 13830 1727204121.33982: Evaluated conditional (network_provider == "nm"): True 13830 1727204121.34299: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204121.34499: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204121.34850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204121.39672: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204121.39740: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204121.39945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204121.39985: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204121.40011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204121.40315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204121.40342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204121.40369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204121.40479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204121.40494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204121.40545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204121.40568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204121.40651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204121.40690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204121.40703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204121.40910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204121.40935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204121.41001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204121.41040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204121.41138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 13830 1727204121.41468: variable 'network_connections' from source: task vars 13830 1727204121.41482: variable 'controller_profile' from source: play vars 13830 1727204121.41596: variable 'controller_profile' from source: play vars 13830 1727204121.42400: variable 'controller_device' from source: play vars 13830 1727204121.42479: variable 'controller_device' from source: play vars 13830 1727204121.42489: variable 'dhcp_interface1' from source: play vars 13830 1727204121.42575: variable 'dhcp_interface1' from source: play vars 13830 1727204121.42578: variable 'port1_profile' from source: play vars 13830 1727204121.43130: variable 'port1_profile' from source: play vars 13830 1727204121.43137: variable 'dhcp_interface1' from source: play vars 13830 1727204121.43200: variable 'dhcp_interface1' from source: play vars 13830 1727204121.43206: variable 'controller_profile' from source: play vars 13830 1727204121.43268: variable 'controller_profile' from source: play vars 13830 1727204121.43461: variable 'port2_profile' from source: play vars 13830 1727204121.43522: variable 'port2_profile' from source: play vars 13830 1727204121.43529: variable 'dhcp_interface2' from source: play vars 13830 1727204121.43591: variable 'dhcp_interface2' from source: play vars 13830 1727204121.43603: variable 'controller_profile' from source: play vars 13830 1727204121.43654: variable 'controller_profile' from source: play vars 13830 1727204121.43738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204121.43927: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204121.43970: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204121.44007: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204121.44038: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204121.44088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204121.44110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204121.44141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204121.44171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204121.44224: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204121.44501: variable 'network_connections' from source: task vars 13830 1727204121.44506: variable 'controller_profile' from source: play vars 13830 1727204121.44572: variable 'controller_profile' from source: play vars 13830 1727204121.44579: variable 'controller_device' from source: play vars 13830 1727204121.44638: variable 'controller_device' from source: play vars 13830 1727204121.44645: variable 'dhcp_interface1' from source: play vars 13830 1727204121.44706: variable 
'dhcp_interface1' from source: play vars 13830 1727204121.44740: variable 'port1_profile' from source: play vars 13830 1727204121.44805: variable 'port1_profile' from source: play vars 13830 1727204121.44813: variable 'dhcp_interface1' from source: play vars 13830 1727204121.44869: variable 'dhcp_interface1' from source: play vars 13830 1727204121.44877: variable 'controller_profile' from source: play vars 13830 1727204121.44938: variable 'controller_profile' from source: play vars 13830 1727204121.44941: variable 'port2_profile' from source: play vars 13830 1727204121.45026: variable 'port2_profile' from source: play vars 13830 1727204121.45032: variable 'dhcp_interface2' from source: play vars 13830 1727204121.45091: variable 'dhcp_interface2' from source: play vars 13830 1727204121.45097: variable 'controller_profile' from source: play vars 13830 1727204121.45168: variable 'controller_profile' from source: play vars 13830 1727204121.45218: Evaluated conditional (__network_wpa_supplicant_required): False 13830 1727204121.45221: when evaluation is False, skipping this task 13830 1727204121.45224: _execute() done 13830 1727204121.45226: dumping result to json 13830 1727204121.45228: done dumping result, returning 13830 1727204121.45238: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1659-6b02-000000000a3b] 13830 1727204121.45245: sending task result for task 0affcd87-79f5-1659-6b02-000000000a3b 13830 1727204121.45346: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a3b 13830 1727204121.45349: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13830 1727204121.45400: no more pending results, returning what we have 13830 1727204121.45404: results queue empty 13830 1727204121.45405: checking for any_errors_fatal 13830 1727204121.45421: done checking for any_errors_fatal 13830 1727204121.45422: checking for max_fail_percentage 13830 1727204121.45424: done checking for max_fail_percentage 13830 1727204121.45425: checking to see if all hosts have failed and the running result is not ok 13830 1727204121.45426: done checking to see if all hosts have failed 13830 1727204121.45426: getting the remaining hosts for this loop 13830 1727204121.45428: done getting the remaining hosts for this loop 13830 1727204121.45434: getting the next task for host managed-node3 13830 1727204121.45443: done getting next task for host managed-node3 13830 1727204121.45447: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13830 1727204121.45452: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204121.45476: getting variables 13830 1727204121.45478: in VariableManager get_vars() 13830 1727204121.45519: Calling all_inventory to load vars for managed-node3 13830 1727204121.45522: Calling groups_inventory to load vars for managed-node3 13830 1727204121.45523: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204121.45534: Calling all_plugins_play to load vars for managed-node3 13830 1727204121.45537: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204121.45540: Calling groups_plugins_play to load vars for managed-node3 13830 1727204121.47590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204121.50118: done with get_vars() 13830 1727204121.50159: done getting variables 13830 1727204121.50227: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.206) 0:00:54.580 ***** 13830 1727204121.50270: entering _queue_task() for managed-node3/service 13830 1727204121.50636: worker is 1 (out of 1 available) 13830 1727204121.50650: exiting _queue_task() for managed-node3/service 13830 1727204121.50663: done queuing things up, now waiting for results queue to drain 13830 1727204121.51044: waiting for pending results... 
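The skipping result above for "Enable and start wpa_supplicant" (roles/network/tasks/main.yml:133) comes from a compound conditional: network_provider == "nm" evaluated True, but __network_wpa_supplicant_required evaluated False after the role read __network_ieee802_1x_connections_defined and __network_wireless_connections_defined and walked the network_connections list. The role's actual task is not shown in this log; the snippet below is only a simplified sketch of that conditional shape, with the service parameters assumed.

# Simplified sketch (assumed parameters, role-tasks-file style) of the gated wpa_supplicant task.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool   # False in this run: no wireless/802.1x connections defined

When a clause in the when: list evaluates to False, Ansible reports the task as skipping and records the failing clause as false_condition, which is exactly the "__network_wpa_supplicant_required" value shown in the skipping result above.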
13830 1727204121.51411: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 13830 1727204121.51584: in run() - task 0affcd87-79f5-1659-6b02-000000000a3c 13830 1727204121.51611: variable 'ansible_search_path' from source: unknown 13830 1727204121.51621: variable 'ansible_search_path' from source: unknown 13830 1727204121.51665: calling self._execute() 13830 1727204121.51765: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204121.51777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204121.51790: variable 'omit' from source: magic vars 13830 1727204121.52172: variable 'ansible_distribution_major_version' from source: facts 13830 1727204121.52191: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204121.52320: variable 'network_provider' from source: set_fact 13830 1727204121.52335: Evaluated conditional (network_provider == "initscripts"): False 13830 1727204121.52343: when evaluation is False, skipping this task 13830 1727204121.52349: _execute() done 13830 1727204121.52361: dumping result to json 13830 1727204121.52371: done dumping result, returning 13830 1727204121.52382: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1659-6b02-000000000a3c] 13830 1727204121.52392: sending task result for task 0affcd87-79f5-1659-6b02-000000000a3c skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204121.52561: no more pending results, returning what we have 13830 1727204121.52568: results queue empty 13830 1727204121.52569: checking for any_errors_fatal 13830 1727204121.52581: done checking for any_errors_fatal 13830 1727204121.52582: checking for max_fail_percentage 13830 1727204121.52583: done checking for max_fail_percentage 13830 1727204121.52585: checking to see if all hosts have failed and the running result is not ok 13830 1727204121.52585: done checking to see if all hosts have failed 13830 1727204121.52586: getting the remaining hosts for this loop 13830 1727204121.52588: done getting the remaining hosts for this loop 13830 1727204121.52593: getting the next task for host managed-node3 13830 1727204121.52603: done getting next task for host managed-node3 13830 1727204121.52608: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13830 1727204121.52614: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204121.52645: getting variables 13830 1727204121.52648: in VariableManager get_vars() 13830 1727204121.52701: Calling all_inventory to load vars for managed-node3 13830 1727204121.52705: Calling groups_inventory to load vars for managed-node3 13830 1727204121.52708: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204121.52720: Calling all_plugins_play to load vars for managed-node3 13830 1727204121.52723: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204121.52726: Calling groups_plugins_play to load vars for managed-node3 13830 1727204121.54491: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a3c 13830 1727204121.54495: WORKER PROCESS EXITING 13830 1727204121.55900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204121.58135: done with get_vars() 13830 1727204121.58169: done getting variables 13830 1727204121.58224: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.079) 0:00:54.660 ***** 13830 1727204121.58266: entering _queue_task() for managed-node3/copy 13830 1727204121.58592: worker is 1 (out of 1 available) 13830 1727204121.58605: exiting _queue_task() for managed-node3/copy 13830 1727204121.58615: done queuing things up, now waiting for results queue to drain 13830 1727204121.58616: waiting for pending results... 
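The "Enable network service" task skipped above (tasks/main.yml:142) and the "Ensure initscripts network file dependency is present" task queued next (tasks/main.yml:150) are both gated on network_provider == "initscripts", which is False here because network_provider was set to "nm". The role source is not included in this log; the sketch below only illustrates that gating. The log confirms the action plugins involved are 'service' and 'copy'; the service name, file path, and file content below are placeholders, not values taken from the role.

# Illustrative sketch (role-tasks-file style, placeholder details) of the initscripts-only tasks skipped in this run.
- name: Enable network service
  ansible.builtin.service:
    name: network                      # placeholder service name
    state: started
    enabled: true
  when: network_provider == "initscripts"   # False here: provider is "nm"

- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network       # placeholder path for illustration
    content: "# Managed by the network system role\n"
    mode: "0644"
  when: network_provider == "initscripts"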
13830 1727204121.58942: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13830 1727204121.59125: in run() - task 0affcd87-79f5-1659-6b02-000000000a3d 13830 1727204121.59148: variable 'ansible_search_path' from source: unknown 13830 1727204121.59155: variable 'ansible_search_path' from source: unknown 13830 1727204121.59204: calling self._execute() 13830 1727204121.59408: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204121.59433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204121.59451: variable 'omit' from source: magic vars 13830 1727204121.60804: variable 'ansible_distribution_major_version' from source: facts 13830 1727204121.60827: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204121.60980: variable 'network_provider' from source: set_fact 13830 1727204121.60992: Evaluated conditional (network_provider == "initscripts"): False 13830 1727204121.60999: when evaluation is False, skipping this task 13830 1727204121.61006: _execute() done 13830 1727204121.61014: dumping result to json 13830 1727204121.61022: done dumping result, returning 13830 1727204121.61039: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1659-6b02-000000000a3d] 13830 1727204121.61050: sending task result for task 0affcd87-79f5-1659-6b02-000000000a3d skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13830 1727204121.61223: no more pending results, returning what we have 13830 1727204121.61227: results queue empty 13830 1727204121.61228: checking for any_errors_fatal 13830 1727204121.61237: done checking for any_errors_fatal 13830 1727204121.61238: checking for max_fail_percentage 13830 1727204121.61241: done checking for max_fail_percentage 13830 1727204121.61242: checking to see if all hosts have failed and the running result is not ok 13830 1727204121.61243: done checking to see if all hosts have failed 13830 1727204121.61243: getting the remaining hosts for this loop 13830 1727204121.61245: done getting the remaining hosts for this loop 13830 1727204121.61250: getting the next task for host managed-node3 13830 1727204121.61259: done getting next task for host managed-node3 13830 1727204121.61266: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13830 1727204121.61271: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204121.61297: getting variables 13830 1727204121.61300: in VariableManager get_vars() 13830 1727204121.61350: Calling all_inventory to load vars for managed-node3 13830 1727204121.61354: Calling groups_inventory to load vars for managed-node3 13830 1727204121.61356: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204121.61372: Calling all_plugins_play to load vars for managed-node3 13830 1727204121.61375: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204121.61379: Calling groups_plugins_play to load vars for managed-node3 13830 1727204121.62384: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a3d 13830 1727204121.62388: WORKER PROCESS EXITING 13830 1727204121.63295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204121.64981: done with get_vars() 13830 1727204121.65011: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.068) 0:00:54.729 ***** 13830 1727204121.65109: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 13830 1727204121.65461: worker is 1 (out of 1 available) 13830 1727204121.65475: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 13830 1727204121.65487: done queuing things up, now waiting for results queue to drain 13830 1727204121.65489: waiting for pending results... 
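The "Configure networking connection profiles" task queued here (tasks/main.yml:159) renders its connection list from the play vars that the log keeps re-reading: controller_profile, controller_device, port1_profile, port2_profile, dhcp_interface1, and dhcp_interface2. Their concrete values are never printed in this portion of the log, so the block below is a purely hypothetical illustration of the general shape in which such vars are commonly combined into network_connections for a controller profile with two ports; the profile types and every key other than the variable names themselves are assumptions, not data from this run.

# Hypothetical illustration only: actual values and structure are not present in this log.
network_connections:
  - name: "{{ controller_profile }}"
    interface_name: "{{ controller_device }}"
    type: bond                               # assumed profile type for illustration
    state: up
  - name: "{{ port1_profile }}"
    interface_name: "{{ dhcp_interface1 }}"
    type: ethernet
    controller: "{{ controller_profile }}"   # attaches the port to the controller profile
    state: up
  - name: "{{ port2_profile }}"
    interface_name: "{{ dhcp_interface2 }}"
    type: ethernet
    controller: "{{ controller_profile }}"
    state: up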
13830 1727204121.66793: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13830 1727204121.66979: in run() - task 0affcd87-79f5-1659-6b02-000000000a3e 13830 1727204121.67003: variable 'ansible_search_path' from source: unknown 13830 1727204121.67012: variable 'ansible_search_path' from source: unknown 13830 1727204121.67060: calling self._execute() 13830 1727204121.67173: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204121.67185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204121.67201: variable 'omit' from source: magic vars 13830 1727204121.67598: variable 'ansible_distribution_major_version' from source: facts 13830 1727204121.67619: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204121.67631: variable 'omit' from source: magic vars 13830 1727204121.67716: variable 'omit' from source: magic vars 13830 1727204121.67885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204121.72201: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204121.72284: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204121.72330: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204121.72379: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204121.72413: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204121.72502: variable 'network_provider' from source: set_fact 13830 1727204121.72800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204121.72835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204121.72909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204121.73029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204121.73046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204121.73237: variable 'omit' from source: magic vars 13830 1727204121.73413: variable 'omit' from source: magic vars 13830 1727204121.73754: variable 'network_connections' from source: task vars 13830 1727204121.73773: variable 'controller_profile' from source: play vars 13830 1727204121.73840: variable 'controller_profile' from source: play vars 13830 1727204121.73859: variable 'controller_device' from source: play vars 13830 1727204121.73928: variable 'controller_device' from source: play vars 13830 1727204121.74074: variable 'dhcp_interface1' 
from source: play vars 13830 1727204121.74134: variable 'dhcp_interface1' from source: play vars 13830 1727204121.74148: variable 'port1_profile' from source: play vars 13830 1727204121.74216: variable 'port1_profile' from source: play vars 13830 1727204121.74227: variable 'dhcp_interface1' from source: play vars 13830 1727204121.74294: variable 'dhcp_interface1' from source: play vars 13830 1727204121.74306: variable 'controller_profile' from source: play vars 13830 1727204121.74371: variable 'controller_profile' from source: play vars 13830 1727204121.74384: variable 'port2_profile' from source: play vars 13830 1727204121.74450: variable 'port2_profile' from source: play vars 13830 1727204121.74463: variable 'dhcp_interface2' from source: play vars 13830 1727204121.74554: variable 'dhcp_interface2' from source: play vars 13830 1727204121.74567: variable 'controller_profile' from source: play vars 13830 1727204121.74631: variable 'controller_profile' from source: play vars 13830 1727204121.74844: variable 'omit' from source: magic vars 13830 1727204121.74858: variable '__lsr_ansible_managed' from source: task vars 13830 1727204121.74924: variable '__lsr_ansible_managed' from source: task vars 13830 1727204121.75141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13830 1727204121.75357: Loaded config def from plugin (lookup/template) 13830 1727204121.75371: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13830 1727204121.75404: File lookup term: get_ansible_managed.j2 13830 1727204121.75412: variable 'ansible_search_path' from source: unknown 13830 1727204121.75421: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13830 1727204121.75439: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13830 1727204121.75459: variable 'ansible_search_path' from source: unknown 13830 1727204121.90306: variable 'ansible_managed' from source: unknown 13830 1727204121.90484: variable 'omit' from source: magic vars 13830 1727204121.90525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204121.90554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204121.90578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204121.90599: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204121.90612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204121.90641: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204121.90649: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204121.90657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204121.90752: Set connection var ansible_connection to ssh 13830 1727204121.90769: Set connection var ansible_timeout to 10 13830 1727204121.90779: Set connection var ansible_shell_executable to /bin/sh 13830 1727204121.90794: Set connection var ansible_shell_type to sh 13830 1727204121.90805: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204121.90821: Set connection var ansible_pipelining to False 13830 1727204121.90853: variable 'ansible_shell_executable' from source: unknown 13830 1727204121.90861: variable 'ansible_connection' from source: unknown 13830 1727204121.90869: variable 'ansible_module_compression' from source: unknown 13830 1727204121.90875: variable 'ansible_shell_type' from source: unknown 13830 1727204121.90881: variable 'ansible_shell_executable' from source: unknown 13830 1727204121.90887: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204121.90893: variable 'ansible_pipelining' from source: unknown 13830 1727204121.90899: variable 'ansible_timeout' from source: unknown 13830 1727204121.90906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204121.91035: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204121.91050: variable 'omit' from source: magic vars 13830 1727204121.91069: starting attempt loop 13830 1727204121.91077: running the handler 13830 1727204121.91092: _low_level_execute_command(): starting 13830 1727204121.91102: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204121.91799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204121.91816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204121.91829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204121.91845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204121.91891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204121.91903: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204121.91916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204121.91936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204121.91947: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204121.91956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204121.91969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 
1727204121.91983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204121.91998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204121.92009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204121.92022: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204121.92039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204121.92120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204121.92137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204121.92154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204121.92233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204121.93881: stdout chunk (state=3): >>>/root <<< 13830 1727204121.94089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204121.94094: stdout chunk (state=3): >>><<< 13830 1727204121.94096: stderr chunk (state=3): >>><<< 13830 1727204121.94223: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204121.94227: _low_level_execute_command(): starting 13830 1727204121.94230: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590 `" && echo ansible-tmp-1727204121.941184-17770-269292512230590="` echo /root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590 `" ) && sleep 0' 13830 1727204121.95061: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204121.95066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204121.95103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204121.95107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204121.95109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204121.95163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204121.95995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204121.95998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204121.96053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204121.97901: stdout chunk (state=3): >>>ansible-tmp-1727204121.941184-17770-269292512230590=/root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590 <<< 13830 1727204121.98011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204121.98098: stderr chunk (state=3): >>><<< 13830 1727204121.98101: stdout chunk (state=3): >>><<< 13830 1727204121.98472: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204121.941184-17770-269292512230590=/root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204121.98480: variable 'ansible_module_compression' from source: unknown 13830 1727204121.98482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13830 1727204121.98484: variable 'ansible_facts' from source: unknown 13830 1727204121.98486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590/AnsiballZ_network_connections.py 13830 1727204121.98826: Sending initial data 13830 1727204121.98830: Sent initial data (167 bytes) 13830 1727204122.01251: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204122.01314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204122.01330: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204122.01351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204122.01398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204122.01426: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204122.01443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204122.01460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204122.01473: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204122.01483: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204122.01493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204122.01504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204122.01523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204122.01541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204122.01552: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204122.01563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204122.01652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204122.01675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204122.01689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204122.01852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204122.03517: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204122.03552: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204122.03593: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpiq1fa3_9 /root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590/AnsiballZ_network_connections.py <<< 13830 1727204122.03627: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204122.05416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204122.05483: stderr chunk (state=3): >>><<< 13830 1727204122.05487: stdout chunk (state=3): >>><<< 13830 1727204122.05508: done transferring module to remote 13830 1727204122.05518: _low_level_execute_command(): starting 13830 1727204122.05524: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590/ /root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590/AnsiballZ_network_connections.py && sleep 0' 13830 1727204122.06167: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204122.06177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204122.06189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204122.06201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204122.06241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204122.06248: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204122.06258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204122.06273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204122.06281: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204122.06287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204122.06294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204122.06304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204122.06313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204122.06320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204122.06326: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204122.06337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204122.06399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204122.06423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204122.06426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204122.06497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204122.08188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204122.08295: stderr chunk (state=3): >>><<< 13830 1727204122.08299: stdout chunk (state=3): >>><<< 13830 1727204122.08302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204122.08310: _low_level_execute_command(): starting 13830 1727204122.08313: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590/AnsiballZ_network_connections.py && sleep 0' 13830 1727204122.08988: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204122.08995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204122.09006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204122.09019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204122.09061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204122.09070: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204122.09081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204122.09095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204122.09102: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204122.09109: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204122.09117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204122.09126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204122.09141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204122.09148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204122.09155: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204122.09167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204122.09256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204122.09259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204122.09268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204122.09347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204122.46058: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 
(not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13830 1727204122.48092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204122.48198: stderr chunk (state=3): >>><<< 13830 1727204122.48202: stdout chunk (state=3): >>><<< 13830 1727204122.48271: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": 
"up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204122.48399: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'arp_interval': 60, 'arp_ip_target': '192.0.2.128', 'arp_validate': 'none', 'primary': 'test1'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204122.48403: _low_level_execute_command(): starting 13830 1727204122.48405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204121.941184-17770-269292512230590/ > /dev/null 2>&1 && sleep 0' 13830 1727204122.49095: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204122.49109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204122.49123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204122.49145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204122.49196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 13830 1727204122.49209: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204122.49224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204122.49245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204122.49256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204122.49270: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204122.49284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204122.49298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204122.49315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204122.49328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204122.49345: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204122.49360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204122.49445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204122.49468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204122.49489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204122.49559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204122.51459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204122.51465: stdout chunk (state=3): >>><<< 13830 1727204122.51468: stderr chunk (state=3): >>><<< 13830 1727204122.52074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204122.52078: handler run complete 13830 1727204122.52081: attempt loop complete, returning result 13830 1727204122.52083: _execute() done 13830 1727204122.52086: dumping result to json 13830 1727204122.52088: done dumping result, returning 13830 1727204122.52090: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1659-6b02-000000000a3e] 13830 1727204122.52093: sending task 
result for task 0affcd87-79f5-1659-6b02-000000000a3e 13830 1727204122.52197: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a3e 13830 1727204122.52202: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 (not-active) 13830 1727204122.52354: no more pending results, returning what we have 13830 1727204122.52358: results queue empty 13830 1727204122.52358: checking for any_errors_fatal 13830 1727204122.52384: done checking for any_errors_fatal 13830 1727204122.52385: checking for max_fail_percentage 13830 1727204122.52387: done checking for max_fail_percentage 13830 1727204122.52388: checking to see if all hosts have failed and the running result is not ok 13830 1727204122.52389: done checking to see if all hosts have failed 13830 1727204122.52390: getting the remaining hosts for this loop 13830 1727204122.52392: done getting the remaining hosts for this loop 13830 1727204122.52395: getting the next task for host managed-node3 13830 1727204122.52403: done getting next task for host managed-node3 13830 1727204122.52408: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13830 1727204122.52413: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204122.52426: getting variables 13830 1727204122.52428: in VariableManager get_vars() 13830 1727204122.52472: Calling all_inventory to load vars for managed-node3 13830 1727204122.52480: Calling groups_inventory to load vars for managed-node3 13830 1727204122.52483: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204122.52492: Calling all_plugins_play to load vars for managed-node3 13830 1727204122.52495: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204122.52498: Calling groups_plugins_play to load vars for managed-node3 13830 1727204122.54548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204122.57886: done with get_vars() 13830 1727204122.57924: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.929) 0:00:55.658 ***** 13830 1727204122.58027: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 13830 1727204122.58805: worker is 1 (out of 1 available) 13830 1727204122.58819: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 13830 1727204122.58835: done queuing things up, now waiting for results queue to drain 13830 1727204122.58836: waiting for pending results... 13830 1727204122.59136: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 13830 1727204122.59308: in run() - task 0affcd87-79f5-1659-6b02-000000000a3f 13830 1727204122.59329: variable 'ansible_search_path' from source: unknown 13830 1727204122.59338: variable 'ansible_search_path' from source: unknown 13830 1727204122.59382: calling self._execute() 13830 1727204122.59484: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204122.59499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204122.59513: variable 'omit' from source: magic vars 13830 1727204122.59903: variable 'ansible_distribution_major_version' from source: facts 13830 1727204122.59921: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204122.60056: variable 'network_state' from source: role '' defaults 13830 1727204122.60073: Evaluated conditional (network_state != {}): False 13830 1727204122.60080: when evaluation is False, skipping this task 13830 1727204122.60086: _execute() done 13830 1727204122.60093: dumping result to json 13830 1727204122.60100: done dumping result, returning 13830 1727204122.60109: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1659-6b02-000000000a3f] 13830 1727204122.60118: sending task result for task 0affcd87-79f5-1659-6b02-000000000a3f 13830 1727204122.60241: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a3f 13830 1727204122.60248: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204122.60311: no more pending results, returning what we have 13830 1727204122.60315: results queue empty 13830 
1727204122.60316: checking for any_errors_fatal 13830 1727204122.60330: done checking for any_errors_fatal 13830 1727204122.60331: checking for max_fail_percentage 13830 1727204122.60335: done checking for max_fail_percentage 13830 1727204122.60336: checking to see if all hosts have failed and the running result is not ok 13830 1727204122.60337: done checking to see if all hosts have failed 13830 1727204122.60338: getting the remaining hosts for this loop 13830 1727204122.60340: done getting the remaining hosts for this loop 13830 1727204122.60344: getting the next task for host managed-node3 13830 1727204122.60352: done getting next task for host managed-node3 13830 1727204122.60356: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13830 1727204122.60362: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204122.60391: getting variables 13830 1727204122.60393: in VariableManager get_vars() 13830 1727204122.60440: Calling all_inventory to load vars for managed-node3 13830 1727204122.60443: Calling groups_inventory to load vars for managed-node3 13830 1727204122.60446: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204122.60458: Calling all_plugins_play to load vars for managed-node3 13830 1727204122.60460: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204122.60465: Calling groups_plugins_play to load vars for managed-node3 13830 1727204122.62194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204122.64071: done with get_vars() 13830 1727204122.64095: done getting variables 13830 1727204122.64163: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.061) 0:00:55.720 ***** 13830 1727204122.64207: entering _queue_task() for managed-node3/debug 13830 1727204122.64579: worker is 1 (out of 1 available) 13830 1727204122.64590: exiting _queue_task() for managed-node3/debug 13830 1727204122.64601: done queuing things up, now waiting for results queue to drain 13830 1727204122.64602: waiting for pending results... 13830 1727204122.64898: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13830 1727204122.65077: in run() - task 0affcd87-79f5-1659-6b02-000000000a40 13830 1727204122.65096: variable 'ansible_search_path' from source: unknown 13830 1727204122.65103: variable 'ansible_search_path' from source: unknown 13830 1727204122.65145: calling self._execute() 13830 1727204122.65251: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204122.65267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204122.65283: variable 'omit' from source: magic vars 13830 1727204122.65660: variable 'ansible_distribution_major_version' from source: facts 13830 1727204122.65681: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204122.65694: variable 'omit' from source: magic vars 13830 1727204122.65769: variable 'omit' from source: magic vars 13830 1727204122.65812: variable 'omit' from source: magic vars 13830 1727204122.65862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204122.65905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204122.65937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204122.65959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204122.65977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204122.66008: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204122.66017: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204122.66028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204122.66131: Set connection var ansible_connection to ssh 13830 1727204122.66151: Set connection var ansible_timeout to 10 13830 1727204122.66160: Set connection var ansible_shell_executable to /bin/sh 13830 1727204122.66169: Set connection var ansible_shell_type to sh 13830 1727204122.66178: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204122.66191: Set connection var ansible_pipelining to False 13830 1727204122.66244: variable 'ansible_shell_executable' from source: unknown 13830 1727204122.66254: variable 'ansible_connection' from source: unknown 13830 1727204122.66261: variable 'ansible_module_compression' from source: unknown 13830 1727204122.66270: variable 'ansible_shell_type' from source: unknown 13830 1727204122.66276: variable 'ansible_shell_executable' from source: unknown 13830 1727204122.66282: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204122.66289: variable 'ansible_pipelining' from source: unknown 13830 1727204122.66295: variable 'ansible_timeout' from source: unknown 13830 1727204122.66302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204122.66452: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204122.66473: variable 'omit' from source: magic vars 13830 1727204122.66484: starting attempt loop 13830 1727204122.66490: running the handler 13830 1727204122.66628: variable '__network_connections_result' from source: set_fact 13830 1727204122.66705: handler run complete 13830 1727204122.66727: attempt loop complete, returning result 13830 1727204122.66736: _execute() done 13830 1727204122.66743: dumping result to json 13830 1727204122.66751: done dumping result, returning 13830 1727204122.66763: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1659-6b02-000000000a40] 13830 1727204122.66774: sending task result for task 0affcd87-79f5-1659-6b02-000000000a40 13830 1727204122.66886: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a40 ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 (not-active)" ] } 13830 1727204122.66973: no more pending results, returning what we have 13830 1727204122.66977: 
results queue empty 13830 1727204122.66978: checking for any_errors_fatal 13830 1727204122.66986: done checking for any_errors_fatal 13830 1727204122.66987: checking for max_fail_percentage 13830 1727204122.66988: done checking for max_fail_percentage 13830 1727204122.66989: checking to see if all hosts have failed and the running result is not ok 13830 1727204122.66990: done checking to see if all hosts have failed 13830 1727204122.66991: getting the remaining hosts for this loop 13830 1727204122.66993: done getting the remaining hosts for this loop 13830 1727204122.66997: getting the next task for host managed-node3 13830 1727204122.67006: done getting next task for host managed-node3 13830 1727204122.67011: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13830 1727204122.67015: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204122.67030: getting variables 13830 1727204122.67034: in VariableManager get_vars() 13830 1727204122.67082: Calling all_inventory to load vars for managed-node3 13830 1727204122.67085: Calling groups_inventory to load vars for managed-node3 13830 1727204122.67087: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204122.67097: Calling all_plugins_play to load vars for managed-node3 13830 1727204122.67100: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204122.67109: Calling groups_plugins_play to load vars for managed-node3 13830 1727204122.73236: WORKER PROCESS EXITING 13830 1727204122.74247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204122.78103: done with get_vars() 13830 1727204122.78139: done getting variables 13830 1727204122.78194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.140) 0:00:55.860 ***** 13830 1727204122.78232: entering _queue_task() for managed-node3/debug 13830 1727204122.79386: worker is 1 (out of 1 available) 13830 1727204122.79398: exiting _queue_task() for managed-node3/debug 13830 1727204122.79409: done queuing things up, now waiting for results queue to drain 13830 1727204122.79410: waiting for pending results... 
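The two tasks traced here, "Show stderr messages for the network_connections" and "Show debug messages for the network_connections", are plain debug actions over the __network_connections_result fact that the role registered earlier. A minimal sketch of what such tasks look like, inferred only from the task names and the variables printed in this log (the real definitions live in roles/network/tasks/main.yml and are not reproduced verbatim here):

# Sketch only; inferred from the task names and the printed variables above,
# not copied from roles/network/tasks/main.yml.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
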
13830 1727204122.80056: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13830 1727204122.80304: in run() - task 0affcd87-79f5-1659-6b02-000000000a41 13830 1727204122.80329: variable 'ansible_search_path' from source: unknown 13830 1727204122.80337: variable 'ansible_search_path' from source: unknown 13830 1727204122.80412: calling self._execute() 13830 1727204122.80547: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204122.80558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204122.80587: variable 'omit' from source: magic vars 13830 1727204122.81110: variable 'ansible_distribution_major_version' from source: facts 13830 1727204122.81128: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204122.81139: variable 'omit' from source: magic vars 13830 1727204122.81256: variable 'omit' from source: magic vars 13830 1727204122.81302: variable 'omit' from source: magic vars 13830 1727204122.81352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204122.81412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204122.81436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204122.81491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204122.81507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204122.81539: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204122.81546: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204122.81553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204122.81660: Set connection var ansible_connection to ssh 13830 1727204122.81756: Set connection var ansible_timeout to 10 13830 1727204122.81770: Set connection var ansible_shell_executable to /bin/sh 13830 1727204122.81777: Set connection var ansible_shell_type to sh 13830 1727204122.81786: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204122.81799: Set connection var ansible_pipelining to False 13830 1727204122.81829: variable 'ansible_shell_executable' from source: unknown 13830 1727204122.81836: variable 'ansible_connection' from source: unknown 13830 1727204122.81843: variable 'ansible_module_compression' from source: unknown 13830 1727204122.81848: variable 'ansible_shell_type' from source: unknown 13830 1727204122.81854: variable 'ansible_shell_executable' from source: unknown 13830 1727204122.81860: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204122.81870: variable 'ansible_pipelining' from source: unknown 13830 1727204122.81876: variable 'ansible_timeout' from source: unknown 13830 1727204122.81885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204122.82068: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 
1727204122.82086: variable 'omit' from source: magic vars 13830 1727204122.82095: starting attempt loop 13830 1727204122.82101: running the handler 13830 1727204122.82156: variable '__network_connections_result' from source: set_fact 13830 1727204122.82275: variable '__network_connections_result' from source: set_fact 13830 1727204122.82468: handler run complete 13830 1727204122.82568: attempt loop complete, returning result 13830 1727204122.82576: _execute() done 13830 1727204122.82583: dumping result to json 13830 1727204122.82595: done dumping result, returning 13830 1727204122.82607: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1659-6b02-000000000a41] 13830 1727204122.82616: sending task result for task 0affcd87-79f5-1659-6b02-000000000a41 ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 (not-active)" ] } } 13830 1727204122.82861: no more pending results, returning what we have 13830 1727204122.82867: results queue empty 13830 1727204122.82875: checking for any_errors_fatal 13830 1727204122.82885: done checking for any_errors_fatal 13830 1727204122.82886: checking for max_fail_percentage 13830 1727204122.82888: done checking for max_fail_percentage 13830 1727204122.82889: checking to see if all hosts 
have failed and the running result is not ok 13830 1727204122.82890: done checking to see if all hosts have failed 13830 1727204122.82890: getting the remaining hosts for this loop 13830 1727204122.82892: done getting the remaining hosts for this loop 13830 1727204122.82896: getting the next task for host managed-node3 13830 1727204122.82904: done getting next task for host managed-node3 13830 1727204122.82909: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13830 1727204122.82914: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204122.82928: getting variables 13830 1727204122.82930: in VariableManager get_vars() 13830 1727204122.82992: Calling all_inventory to load vars for managed-node3 13830 1727204122.82995: Calling groups_inventory to load vars for managed-node3 13830 1727204122.82999: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204122.83010: Calling all_plugins_play to load vars for managed-node3 13830 1727204122.83012: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204122.83015: Calling groups_plugins_play to load vars for managed-node3 13830 1727204122.84188: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a41 13830 1727204122.84192: WORKER PROCESS EXITING 13830 1727204122.85652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204122.89158: done with get_vars() 13830 1727204122.89196: done getting variables 13830 1727204122.89258: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.119) 0:00:55.980 ***** 13830 1727204122.90206: entering _queue_task() for managed-node3/debug 13830 1727204122.90568: worker is 1 (out of 1 available) 13830 1727204122.90582: exiting _queue_task() for managed-node3/debug 13830 1727204122.90600: done queuing things up, now waiting for results queue to drain 13830 1727204122.90602: waiting for 
pending results... 13830 1727204122.91742: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13830 1727204122.91908: in run() - task 0affcd87-79f5-1659-6b02-000000000a42 13830 1727204122.92102: variable 'ansible_search_path' from source: unknown 13830 1727204122.92110: variable 'ansible_search_path' from source: unknown 13830 1727204122.92156: calling self._execute() 13830 1727204122.92276: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204122.92293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204122.92308: variable 'omit' from source: magic vars 13830 1727204122.93167: variable 'ansible_distribution_major_version' from source: facts 13830 1727204122.93189: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204122.93437: variable 'network_state' from source: role '' defaults 13830 1727204122.93495: Evaluated conditional (network_state != {}): False 13830 1727204122.93538: when evaluation is False, skipping this task 13830 1727204122.93547: _execute() done 13830 1727204122.93575: dumping result to json 13830 1727204122.93586: done dumping result, returning 13830 1727204122.93601: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1659-6b02-000000000a42] 13830 1727204122.93638: sending task result for task 0affcd87-79f5-1659-6b02-000000000a42 skipping: [managed-node3] => { "false_condition": "network_state != {}" } 13830 1727204122.93866: no more pending results, returning what we have 13830 1727204122.93870: results queue empty 13830 1727204122.93871: checking for any_errors_fatal 13830 1727204122.93886: done checking for any_errors_fatal 13830 1727204122.93887: checking for max_fail_percentage 13830 1727204122.93889: done checking for max_fail_percentage 13830 1727204122.93890: checking to see if all hosts have failed and the running result is not ok 13830 1727204122.93891: done checking to see if all hosts have failed 13830 1727204122.93891: getting the remaining hosts for this loop 13830 1727204122.93893: done getting the remaining hosts for this loop 13830 1727204122.93897: getting the next task for host managed-node3 13830 1727204122.93905: done getting next task for host managed-node3 13830 1727204122.93908: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13830 1727204122.93914: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204122.93941: getting variables 13830 1727204122.93944: in VariableManager get_vars() 13830 1727204122.93997: Calling all_inventory to load vars for managed-node3 13830 1727204122.94000: Calling groups_inventory to load vars for managed-node3 13830 1727204122.94002: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204122.94015: Calling all_plugins_play to load vars for managed-node3 13830 1727204122.94017: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204122.94019: Calling groups_plugins_play to load vars for managed-node3 13830 1727204122.95190: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a42 13830 1727204122.95194: WORKER PROCESS EXITING 13830 1727204122.96519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204123.00108: done with get_vars() 13830 1727204123.00144: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.100) 0:00:56.080 ***** 13830 1727204123.00252: entering _queue_task() for managed-node3/ping 13830 1727204123.01239: worker is 1 (out of 1 available) 13830 1727204123.01252: exiting _queue_task() for managed-node3/ping 13830 1727204123.01267: done queuing things up, now waiting for results queue to drain 13830 1727204123.01269: waiting for pending results... 13830 1727204123.01972: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13830 1727204123.02536: in run() - task 0affcd87-79f5-1659-6b02-000000000a43 13830 1727204123.02555: variable 'ansible_search_path' from source: unknown 13830 1727204123.02558: variable 'ansible_search_path' from source: unknown 13830 1727204123.02800: calling self._execute() 13830 1727204123.02917: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204123.03106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204123.03123: variable 'omit' from source: magic vars 13830 1727204123.03929: variable 'ansible_distribution_major_version' from source: facts 13830 1727204123.03953: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204123.04080: variable 'omit' from source: magic vars 13830 1727204123.04172: variable 'omit' from source: magic vars 13830 1727204123.04316: variable 'omit' from source: magic vars 13830 1727204123.04369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204123.04414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204123.04695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204123.04853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204123.04874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204123.04909: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204123.04919: variable 
'ansible_host' from source: host vars for 'managed-node3' 13830 1727204123.04928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204123.05042: Set connection var ansible_connection to ssh 13830 1727204123.05297: Set connection var ansible_timeout to 10 13830 1727204123.05308: Set connection var ansible_shell_executable to /bin/sh 13830 1727204123.05315: Set connection var ansible_shell_type to sh 13830 1727204123.05325: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204123.05341: Set connection var ansible_pipelining to False 13830 1727204123.05628: variable 'ansible_shell_executable' from source: unknown 13830 1727204123.05640: variable 'ansible_connection' from source: unknown 13830 1727204123.05649: variable 'ansible_module_compression' from source: unknown 13830 1727204123.05656: variable 'ansible_shell_type' from source: unknown 13830 1727204123.05662: variable 'ansible_shell_executable' from source: unknown 13830 1727204123.05673: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204123.05681: variable 'ansible_pipelining' from source: unknown 13830 1727204123.05689: variable 'ansible_timeout' from source: unknown 13830 1727204123.05697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204123.05961: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204123.06085: variable 'omit' from source: magic vars 13830 1727204123.06096: starting attempt loop 13830 1727204123.06104: running the handler 13830 1727204123.06124: _low_level_execute_command(): starting 13830 1727204123.06140: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204123.07960: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204123.07973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204123.07992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204123.08106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.08166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204123.08327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204123.08330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204123.08393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204123.10025: stdout chunk (state=3): >>>/root <<< 13830 1727204123.10228: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204123.10232: stdout chunk (state=3): >>><<< 13830 1727204123.10238: stderr chunk (state=3): >>><<< 13830 1727204123.10369: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204123.10373: _low_level_execute_command(): starting 13830 1727204123.10376: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788 `" && echo ansible-tmp-1727204123.102629-17927-251865633120788="` echo /root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788 `" ) && sleep 0' 13830 1727204123.11252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204123.11257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204123.11289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204123.11292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13830 1727204123.11303: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204123.11306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.11352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204123.11470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204123.11489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204123.11671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204123.13774: stdout chunk (state=3): 
>>>ansible-tmp-1727204123.102629-17927-251865633120788=/root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788 <<< 13830 1727204123.13951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204123.13956: stdout chunk (state=3): >>><<< 13830 1727204123.13962: stderr chunk (state=3): >>><<< 13830 1727204123.13987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204123.102629-17927-251865633120788=/root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204123.14039: variable 'ansible_module_compression' from source: unknown 13830 1727204123.14081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13830 1727204123.14115: variable 'ansible_facts' from source: unknown 13830 1727204123.14196: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788/AnsiballZ_ping.py 13830 1727204123.14700: Sending initial data 13830 1727204123.14704: Sent initial data (152 bytes) 13830 1727204123.17290: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204123.17295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204123.17326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.17330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204123.17332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.17516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204123.17519: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204123.17521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204123.17582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204123.19369: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204123.19403: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204123.19442: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmphr680dno /root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788/AnsiballZ_ping.py <<< 13830 1727204123.19481: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204123.20791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204123.21068: stderr chunk (state=3): >>><<< 13830 1727204123.21072: stdout chunk (state=3): >>><<< 13830 1727204123.21074: done transferring module to remote 13830 1727204123.21076: _low_level_execute_command(): starting 13830 1727204123.21079: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788/ /root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788/AnsiballZ_ping.py && sleep 0' 13830 1727204123.22546: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204123.22550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204123.22574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.22701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204123.22704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204123.22706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.22768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204123.22925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204123.22928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204123.22984: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204123.24712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204123.24803: stderr chunk (state=3): >>><<< 13830 1727204123.24807: stdout chunk (state=3): >>><<< 13830 1727204123.24903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204123.24908: _low_level_execute_command(): starting 13830 1727204123.24910: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788/AnsiballZ_ping.py && sleep 0' 13830 1727204123.25696: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204123.25710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204123.25724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204123.25746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204123.25792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204123.25804: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204123.25817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.25836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204123.25847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204123.25860: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204123.25874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204123.25889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204123.25906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204123.25919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204123.25931: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204123.25949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 
1727204123.26030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204123.26056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204123.26083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204123.26173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204123.39636: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13830 1727204123.40756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204123.40759: stdout chunk (state=3): >>><<< 13830 1727204123.40762: stderr chunk (state=3): >>><<< 13830 1727204123.40894: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
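The exchange above is the standard AnsiballZ execution flow for the "Re-test connectivity" task: a temporary directory is created under /root/.ansible/tmp, AnsiballZ_ping.py is copied over SFTP, marked executable with chmod u+x, run with the remote /usr/bin/python3.9, and the module answers {"ping": "pong"} before the temporary directory is cleaned up. The task driving it reduces to a bare ping; a minimal sketch, assuming no keywords beyond what the log shows (the actual task sits at roles/network/tasks/main.yml:192):

# Minimal sketch of the connectivity re-test seen above; the actual task at
# roles/network/tasks/main.yml:192 may carry additional keywords.
- name: Re-test connectivity
  ping:
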
13830 1727204123.40899: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204123.40902: _low_level_execute_command(): starting 13830 1727204123.40905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204123.102629-17927-251865633120788/ > /dev/null 2>&1 && sleep 0' 13830 1727204123.42596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204123.42612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204123.42627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204123.42654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204123.42703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204123.42774: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204123.42790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.42808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204123.42821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204123.42832: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204123.42848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204123.42861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204123.42883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204123.42901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204123.42912: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204123.42926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204123.43121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204123.43148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204123.43167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204123.43243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204123.45125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204123.45129: stdout chunk (state=3): >>><<< 13830 1727204123.45131: stderr chunk (state=3): >>><<< 13830 1727204123.45372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204123.45375: handler run complete 13830 1727204123.45378: attempt loop complete, returning result 13830 1727204123.45380: _execute() done 13830 1727204123.45382: dumping result to json 13830 1727204123.45384: done dumping result, returning 13830 1727204123.45386: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1659-6b02-000000000a43] 13830 1727204123.45389: sending task result for task 0affcd87-79f5-1659-6b02-000000000a43 13830 1727204123.45462: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a43 13830 1727204123.45468: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 13830 1727204123.45549: no more pending results, returning what we have 13830 1727204123.45554: results queue empty 13830 1727204123.45555: checking for any_errors_fatal 13830 1727204123.45566: done checking for any_errors_fatal 13830 1727204123.45567: checking for max_fail_percentage 13830 1727204123.45569: done checking for max_fail_percentage 13830 1727204123.45570: checking to see if all hosts have failed and the running result is not ok 13830 1727204123.45570: done checking to see if all hosts have failed 13830 1727204123.45571: getting the remaining hosts for this loop 13830 1727204123.45573: done getting the remaining hosts for this loop 13830 1727204123.45578: getting the next task for host managed-node3 13830 1727204123.45590: done getting next task for host managed-node3 13830 1727204123.45592: ^ task is: TASK: meta (role_complete) 13830 1727204123.45599: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204123.45613: getting variables 13830 1727204123.45615: in VariableManager get_vars() 13830 1727204123.45671: Calling all_inventory to load vars for managed-node3 13830 1727204123.45674: Calling groups_inventory to load vars for managed-node3 13830 1727204123.45677: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204123.45688: Calling all_plugins_play to load vars for managed-node3 13830 1727204123.45691: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204123.45695: Calling groups_plugins_play to load vars for managed-node3 13830 1727204123.48682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204123.53625: done with get_vars() 13830 1727204123.53668: done getting variables 13830 1727204123.53758: done queuing things up, now waiting for results queue to drain 13830 1727204123.53760: results queue empty 13830 1727204123.53761: checking for any_errors_fatal 13830 1727204123.53766: done checking for any_errors_fatal 13830 1727204123.53767: checking for max_fail_percentage 13830 1727204123.53768: done checking for max_fail_percentage 13830 1727204123.53769: checking to see if all hosts have failed and the running result is not ok 13830 1727204123.53770: done checking to see if all hosts have failed 13830 1727204123.53771: getting the remaining hosts for this loop 13830 1727204123.53772: done getting the remaining hosts for this loop 13830 1727204123.53775: getting the next task for host managed-node3 13830 1727204123.53781: done getting next task for host managed-node3 13830 1727204123.53783: ^ task is: TASK: Show result 13830 1727204123.53786: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204123.53788: getting variables 13830 1727204123.53789: in VariableManager get_vars() 13830 1727204123.53806: Calling all_inventory to load vars for managed-node3 13830 1727204123.53809: Calling groups_inventory to load vars for managed-node3 13830 1727204123.53811: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204123.53816: Calling all_plugins_play to load vars for managed-node3 13830 1727204123.53819: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204123.53822: Calling groups_plugins_play to load vars for managed-node3 13830 1727204123.56008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204123.60091: done with get_vars() 13830 1727204123.60115: done getting variables 13830 1727204123.60170: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.599) 0:00:56.680 ***** 13830 1727204123.60206: entering _queue_task() for managed-node3/debug 13830 1727204123.61268: worker is 1 (out of 1 available) 13830 1727204123.61283: exiting _queue_task() for managed-node3/debug 13830 1727204123.61295: done queuing things up, now waiting for results queue to drain 13830 1727204123.61296: waiting for pending results... 
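The "Show result" task queued here (create_bond_profile_reconfigure.yml:33) is another debug action that prints the same __network_connections_result fact, as the output below confirms. A minimal sketch, inferred from the task name and the variable it prints rather than copied from the test playbook:

# Sketch only; the actual task is at
# tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33.
- name: Show result
  debug:
    var: __network_connections_result
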
13830 1727204123.62188: running TaskExecutor() for managed-node3/TASK: Show result 13830 1727204123.62419: in run() - task 0affcd87-79f5-1659-6b02-000000000a73 13830 1727204123.62548: variable 'ansible_search_path' from source: unknown 13830 1727204123.62553: variable 'ansible_search_path' from source: unknown 13830 1727204123.62590: calling self._execute() 13830 1727204123.62804: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204123.62810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204123.62821: variable 'omit' from source: magic vars 13830 1727204123.63653: variable 'ansible_distribution_major_version' from source: facts 13830 1727204123.63667: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204123.63673: variable 'omit' from source: magic vars 13830 1727204123.63696: variable 'omit' from source: magic vars 13830 1727204123.63844: variable 'omit' from source: magic vars 13830 1727204123.63889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204123.63927: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204123.64059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204123.64079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204123.64091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204123.64121: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204123.64125: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204123.64127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204123.64340: Set connection var ansible_connection to ssh 13830 1727204123.64351: Set connection var ansible_timeout to 10 13830 1727204123.64357: Set connection var ansible_shell_executable to /bin/sh 13830 1727204123.64359: Set connection var ansible_shell_type to sh 13830 1727204123.64366: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204123.64376: Set connection var ansible_pipelining to False 13830 1727204123.64609: variable 'ansible_shell_executable' from source: unknown 13830 1727204123.64612: variable 'ansible_connection' from source: unknown 13830 1727204123.64615: variable 'ansible_module_compression' from source: unknown 13830 1727204123.64617: variable 'ansible_shell_type' from source: unknown 13830 1727204123.64620: variable 'ansible_shell_executable' from source: unknown 13830 1727204123.64622: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204123.64624: variable 'ansible_pipelining' from source: unknown 13830 1727204123.64627: variable 'ansible_timeout' from source: unknown 13830 1727204123.64635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204123.64887: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204123.64898: variable 'omit' from source: magic vars 13830 1727204123.64903: 
starting attempt loop 13830 1727204123.64905: running the handler 13830 1727204123.65005: variable '__network_connections_result' from source: set_fact 13830 1727204123.65202: variable '__network_connections_result' from source: set_fact 13830 1727204123.65613: handler run complete 13830 1727204123.65644: attempt loop complete, returning result 13830 1727204123.65647: _execute() done 13830 1727204123.65650: dumping result to json 13830 1727204123.65654: done dumping result, returning 13830 1727204123.65662: done running TaskExecutor() for managed-node3/TASK: Show result [0affcd87-79f5-1659-6b02-000000000a73] 13830 1727204123.65670: sending task result for task 0affcd87-79f5-1659-6b02-000000000a73 13830 1727204123.65885: done sending task result for task 0affcd87-79f5-1659-6b02-000000000a73 13830 1727204123.65888: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, aaf1f626-5889-4bd6-9eb8-491a8b173119 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, e8dfd867-3423-41f0-b2ba-561a2a4c7934 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 12faa5fe-601d-4179-9d4d-c366327061e9 (not-active)" ] } } 13830 1727204123.65993: no more pending results, returning what we have 13830 1727204123.65997: results queue empty 13830 1727204123.66004: checking for any_errors_fatal 13830 1727204123.66007: done checking for any_errors_fatal 13830 1727204123.66008: checking for max_fail_percentage 13830 1727204123.66010: done checking for max_fail_percentage 13830 1727204123.66011: checking to see if all hosts have failed 
and the running result is not ok 13830 1727204123.66012: done checking to see if all hosts have failed 13830 1727204123.66013: getting the remaining hosts for this loop 13830 1727204123.66015: done getting the remaining hosts for this loop 13830 1727204123.66020: getting the next task for host managed-node3 13830 1727204123.66030: done getting next task for host managed-node3 13830 1727204123.66037: ^ task is: TASK: Asserts 13830 1727204123.66041: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204123.66046: getting variables 13830 1727204123.66048: in VariableManager get_vars() 13830 1727204123.66095: Calling all_inventory to load vars for managed-node3 13830 1727204123.66098: Calling groups_inventory to load vars for managed-node3 13830 1727204123.66101: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204123.66112: Calling all_plugins_play to load vars for managed-node3 13830 1727204123.66114: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204123.66118: Calling groups_plugins_play to load vars for managed-node3 13830 1727204123.68261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204123.72216: done with get_vars() 13830 1727204123.72260: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.121) 0:00:56.801 ***** 13830 1727204123.72368: entering _queue_task() for managed-node3/include_tasks 13830 1727204123.73236: worker is 1 (out of 1 available) 13830 1727204123.73248: exiting _queue_task() for managed-node3/include_tasks 13830 1727204123.73261: done queuing things up, now waiting for results queue to drain 13830 1727204123.73262: waiting for pending results... 
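For readability, the module_args captured in the "Show result" output above amount to a network_connections configuration of roughly the following shape (reconstructed from the logged invocation; the wording in the actual test playbook may differ):

network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      arp_interval: 60
      arp_ip_target: 192.0.2.128
      arp_validate: "none"
      primary: test1
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0

The stderr lines in that result confirm that all three profiles were added and brought up through the nm provider, with bond0 reported as is-modified and the two port connections initially not-active.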
13830 1727204123.74435: running TaskExecutor() for managed-node3/TASK: Asserts 13830 1727204123.74672: in run() - task 0affcd87-79f5-1659-6b02-0000000008ef 13830 1727204123.74684: variable 'ansible_search_path' from source: unknown 13830 1727204123.74688: variable 'ansible_search_path' from source: unknown 13830 1727204123.74849: variable 'lsr_assert' from source: include params 13830 1727204123.75283: variable 'lsr_assert' from source: include params 13830 1727204123.75348: variable 'omit' from source: magic vars 13830 1727204123.75939: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204123.75958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204123.75962: variable 'omit' from source: magic vars 13830 1727204123.76428: variable 'ansible_distribution_major_version' from source: facts 13830 1727204123.76438: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204123.76445: variable 'item' from source: unknown 13830 1727204123.76638: variable 'item' from source: unknown 13830 1727204123.76668: variable 'item' from source: unknown 13830 1727204123.76843: variable 'item' from source: unknown 13830 1727204123.76988: dumping result to json 13830 1727204123.76991: done dumping result, returning 13830 1727204123.76994: done running TaskExecutor() for managed-node3/TASK: Asserts [0affcd87-79f5-1659-6b02-0000000008ef] 13830 1727204123.76996: sending task result for task 0affcd87-79f5-1659-6b02-0000000008ef 13830 1727204123.77074: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008ef 13830 1727204123.77077: WORKER PROCESS EXITING 13830 1727204123.77127: no more pending results, returning what we have 13830 1727204123.77135: in VariableManager get_vars() 13830 1727204123.77193: Calling all_inventory to load vars for managed-node3 13830 1727204123.77196: Calling groups_inventory to load vars for managed-node3 13830 1727204123.77199: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204123.77214: Calling all_plugins_play to load vars for managed-node3 13830 1727204123.77217: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204123.77220: Calling groups_plugins_play to load vars for managed-node3 13830 1727204123.80413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204123.84074: done with get_vars() 13830 1727204123.84105: variable 'ansible_search_path' from source: unknown 13830 1727204123.84106: variable 'ansible_search_path' from source: unknown 13830 1727204123.84152: we have included files to process 13830 1727204123.84153: generating all_blocks data 13830 1727204123.84155: done generating all_blocks data 13830 1727204123.84161: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 13830 1727204123.84162: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 13830 1727204123.84166: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 13830 1727204123.84445: in VariableManager get_vars() 13830 1727204123.84475: done with get_vars() 13830 1727204123.84517: in VariableManager get_vars() 13830 1727204123.84544: done with get_vars() 13830 1727204123.84558: done processing included file 13830 1727204123.84560: 
iterating over new_blocks loaded from include file 13830 1727204123.84561: in VariableManager get_vars() 13830 1727204123.85289: done with get_vars() 13830 1727204123.85292: filtering new block on tags 13830 1727204123.85345: done filtering new block on tags 13830 1727204123.85348: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed-node3 => (item=tasks/assert_bond_options.yml) 13830 1727204123.85354: extending task lists for all hosts with included blocks 13830 1727204123.94801: done extending task lists 13830 1727204123.94803: done processing included files 13830 1727204123.94804: results queue empty 13830 1727204123.94805: checking for any_errors_fatal 13830 1727204123.94810: done checking for any_errors_fatal 13830 1727204123.94812: checking for max_fail_percentage 13830 1727204123.94813: done checking for max_fail_percentage 13830 1727204123.94814: checking to see if all hosts have failed and the running result is not ok 13830 1727204123.94815: done checking to see if all hosts have failed 13830 1727204123.94816: getting the remaining hosts for this loop 13830 1727204123.94817: done getting the remaining hosts for this loop 13830 1727204123.94820: getting the next task for host managed-node3 13830 1727204123.94825: done getting next task for host managed-node3 13830 1727204123.94827: ^ task is: TASK: ** TEST check bond settings 13830 1727204123.94831: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204123.94837: getting variables 13830 1727204123.94838: in VariableManager get_vars() 13830 1727204123.94858: Calling all_inventory to load vars for managed-node3 13830 1727204123.94861: Calling groups_inventory to load vars for managed-node3 13830 1727204123.94865: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204123.94871: Calling all_plugins_play to load vars for managed-node3 13830 1727204123.94874: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204123.94877: Calling groups_plugins_play to load vars for managed-node3 13830 1727204123.97466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204124.01172: done with get_vars() 13830 1727204124.01209: done getting variables 13830 1727204124.01259: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.289) 0:00:57.091 ***** 13830 1727204124.01296: entering _queue_task() for managed-node3/command 13830 1727204124.02359: worker is 1 (out of 1 available) 13830 1727204124.02375: exiting _queue_task() for managed-node3/command 13830 1727204124.02388: done queuing things up, now waiting for results queue to drain 13830 1727204124.02389: waiting for pending results... 
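From the details visible in the trace that follows (cmd ["cat", "/sys/class/net/nm-bond/bonding/mode"], ansible_loop_var bond_opt, the registered result, and the evaluated conditional bond_opt.value in result.stdout), the task at assert_bond_options.yml:3 is most likely shaped like the sketch below. This is a reconstruction for readability, not the verbatim test file:

- name: "** TEST check bond settings"
  command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  until: bond_opt.value in result.stdout
  loop: "{{ bond_options_to_assert }}"
  loop_control:
    loop_var: bond_opt

Each iteration reads one bonding option from sysfs on the controller device (nm-bond here) and succeeds once the expected value appears in the command output; the result further down shows the first iteration passing on attempt 1.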
13830 1727204124.03139: running TaskExecutor() for managed-node3/TASK: ** TEST check bond settings 13830 1727204124.03359: in run() - task 0affcd87-79f5-1659-6b02-000000000c2a 13830 1727204124.03500: variable 'ansible_search_path' from source: unknown 13830 1727204124.03510: variable 'ansible_search_path' from source: unknown 13830 1727204124.03571: variable 'bond_options_to_assert' from source: set_fact 13830 1727204124.04067: variable 'bond_options_to_assert' from source: set_fact 13830 1727204124.04296: variable 'omit' from source: magic vars 13830 1727204124.04573: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204124.04701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204124.04721: variable 'omit' from source: magic vars 13830 1727204124.05274: variable 'ansible_distribution_major_version' from source: facts 13830 1727204124.05371: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204124.05382: variable 'omit' from source: magic vars 13830 1727204124.05430: variable 'omit' from source: magic vars 13830 1727204124.05868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204124.11584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204124.11736: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204124.11909: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204124.11955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204124.11993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204124.12336: variable 'controller_device' from source: play vars 13830 1727204124.12350: variable 'bond_opt' from source: unknown 13830 1727204124.12382: variable 'omit' from source: magic vars 13830 1727204124.12425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204124.12548: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204124.12577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204124.12599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204124.12613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204124.12771: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204124.12780: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204124.12788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204124.12903: Set connection var ansible_connection to ssh 13830 1727204124.13074: Set connection var ansible_timeout to 10 13830 1727204124.13086: Set connection var ansible_shell_executable to /bin/sh 13830 1727204124.13093: Set connection var ansible_shell_type to sh 13830 1727204124.13104: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204124.13117: Set connection var ansible_pipelining to False 13830 
1727204124.13148: variable 'ansible_shell_executable' from source: unknown 13830 1727204124.13157: variable 'ansible_connection' from source: unknown 13830 1727204124.13169: variable 'ansible_module_compression' from source: unknown 13830 1727204124.13178: variable 'ansible_shell_type' from source: unknown 13830 1727204124.13185: variable 'ansible_shell_executable' from source: unknown 13830 1727204124.13191: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204124.13197: variable 'ansible_pipelining' from source: unknown 13830 1727204124.13277: variable 'ansible_timeout' from source: unknown 13830 1727204124.13285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204124.13515: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204124.13531: variable 'omit' from source: magic vars 13830 1727204124.13544: starting attempt loop 13830 1727204124.13550: running the handler 13830 1727204124.13571: _low_level_execute_command(): starting 13830 1727204124.13582: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204124.15880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.15886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.16028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204124.16035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.16038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.16105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.16109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.16216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.16379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.18016: stdout chunk (state=3): >>>/root <<< 13830 1727204124.18114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.18201: stderr chunk (state=3): >>><<< 13830 1727204124.18204: stdout chunk (state=3): >>><<< 13830 1727204124.18317: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204124.18327: _low_level_execute_command(): starting 13830 1727204124.18330: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678 `" && echo ansible-tmp-1727204124.1822805-17990-209212328125678="` echo /root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678 `" ) && sleep 0' 13830 1727204124.19831: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204124.19962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.19982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.20003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.20052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.20070: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204124.20086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.20104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204124.20117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204124.20129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204124.20146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.20163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.20185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.20199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.20212: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204124.20227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.20404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.20429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.20448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.20529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.22389: stdout 
chunk (state=3): >>>ansible-tmp-1727204124.1822805-17990-209212328125678=/root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678 <<< 13830 1727204124.22602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.22606: stdout chunk (state=3): >>><<< 13830 1727204124.22608: stderr chunk (state=3): >>><<< 13830 1727204124.22924: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204124.1822805-17990-209212328125678=/root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204124.22929: variable 'ansible_module_compression' from source: unknown 13830 1727204124.22931: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204124.22936: variable 'ansible_facts' from source: unknown 13830 1727204124.22937: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678/AnsiballZ_command.py 13830 1727204124.23002: Sending initial data 13830 1727204124.23005: Sent initial data (156 bytes) 13830 1727204124.24420: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204124.24429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.24440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.24460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.24497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.24505: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204124.24524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.24538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204124.24544: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204124.24552: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204124.24558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.24570: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13830 1727204124.24584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.24592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.24599: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204124.24608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.24687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.24701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.24714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.24788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.26476: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204124.26511: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204124.26555: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmple1wvu55 /root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678/AnsiballZ_command.py <<< 13830 1727204124.26594: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204124.28017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.28152: stderr chunk (state=3): >>><<< 13830 1727204124.28156: stdout chunk (state=3): >>><<< 13830 1727204124.28159: done transferring module to remote 13830 1727204124.28161: _low_level_execute_command(): starting 13830 1727204124.28169: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678/ /root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678/AnsiballZ_command.py && sleep 0' 13830 1727204124.29478: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204124.30147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.30168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.30187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.30239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.30252: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204124.30269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.30288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204124.30300: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204124.30313: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204124.30325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.30341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.30356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.30371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.30381: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204124.30393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.30474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.30492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.30508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.30585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.32386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.32391: stdout chunk (state=3): >>><<< 13830 1727204124.32395: stderr chunk (state=3): >>><<< 13830 1727204124.32494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204124.32498: _low_level_execute_command(): starting 13830 1727204124.32501: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678/AnsiballZ_command.py && sleep 0' 13830 1727204124.34078: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204124.34082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.34085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.34183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13830 1727204124.34187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.34367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.34380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.34460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.48081: stdout chunk (state=3): >>> {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-24 14:55:24.476770", "end": "2024-09-24 14:55:24.479652", "delta": "0:00:00.002882", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204124.49281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204124.49285: stdout chunk (state=3): >>><<< 13830 1727204124.49293: stderr chunk (state=3): >>><<< 13830 1727204124.49316: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-24 14:55:24.476770", "end": "2024-09-24 14:55:24.479652", "delta": "0:00:00.002882", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.87 closed. 13830 1727204124.49356: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204124.49365: _low_level_execute_command(): starting 13830 1727204124.49370: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204124.1822805-17990-209212328125678/ > /dev/null 2>&1 && sleep 0' 13830 1727204124.50946: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204124.50984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.51003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.51016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.51125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.51132: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204124.51146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.51159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204124.51167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204124.51174: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204124.51181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.51196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.51213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.51221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.51227: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204124.51309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.51385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.51415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.51433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.51580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.53399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.53403: stdout chunk (state=3): >>><<< 13830 1727204124.53410: stderr chunk (state=3): >>><<< 13830 1727204124.53434: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204124.53443: handler run complete 13830 1727204124.53470: Evaluated conditional (False): False 13830 1727204124.53636: variable 'bond_opt' from source: unknown 13830 1727204124.53645: variable 'result' from source: set_fact 13830 1727204124.53660: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204124.53680: attempt loop complete, returning result 13830 1727204124.53700: variable 'bond_opt' from source: unknown 13830 1727204124.53770: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'mode', 'value': 'active-backup'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "active-backup" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.002882", "end": "2024-09-24 14:55:24.479652", "rc": 0, "start": "2024-09-24 14:55:24.476770" } STDOUT: active-backup 1 13830 1727204124.53985: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204124.53989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204124.53991: variable 'omit' from source: magic vars 13830 1727204124.54223: variable 'ansible_distribution_major_version' from source: facts 13830 1727204124.54228: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204124.54233: variable 'omit' from source: magic vars 13830 1727204124.54251: variable 'omit' from source: magic vars 13830 1727204124.54668: variable 'controller_device' from source: play vars 13830 1727204124.54671: variable 'bond_opt' from source: unknown 13830 1727204124.54692: variable 'omit' from source: magic vars 13830 1727204124.54716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204124.54721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204124.54728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204124.54860: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204124.54869: variable 'ansible_host' from source: host vars for 'managed-node3' 
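The first loop item has now passed: /sys/class/net/nm-bond/bonding/mode contains "active-backup". Given the bond profile applied earlier, the bond_options_to_assert list driving this loop presumably holds key/value pairs along these lines (only the mode and arp_interval items are exercised in the portion of the log shown here; the remaining entries are inferred from the profile and may not match the test file exactly):

bond_options_to_assert:
  - { key: mode, value: "active-backup" }
  - { key: arp_interval, value: "60" }
  - { key: arp_ip_target, value: "192.0.2.128" }
  - { key: arp_validate, value: "none" }
  - { key: primary, value: "test1" }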
13830 1727204124.54875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204124.54943: Set connection var ansible_connection to ssh 13830 1727204124.54951: Set connection var ansible_timeout to 10 13830 1727204124.54956: Set connection var ansible_shell_executable to /bin/sh 13830 1727204124.54958: Set connection var ansible_shell_type to sh 13830 1727204124.55079: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204124.55095: Set connection var ansible_pipelining to False 13830 1727204124.55116: variable 'ansible_shell_executable' from source: unknown 13830 1727204124.55120: variable 'ansible_connection' from source: unknown 13830 1727204124.55122: variable 'ansible_module_compression' from source: unknown 13830 1727204124.55124: variable 'ansible_shell_type' from source: unknown 13830 1727204124.55127: variable 'ansible_shell_executable' from source: unknown 13830 1727204124.55129: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204124.55135: variable 'ansible_pipelining' from source: unknown 13830 1727204124.55138: variable 'ansible_timeout' from source: unknown 13830 1727204124.55142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204124.55362: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204124.55374: variable 'omit' from source: magic vars 13830 1727204124.55377: starting attempt loop 13830 1727204124.55380: running the handler 13830 1727204124.55388: _low_level_execute_command(): starting 13830 1727204124.55391: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204124.57094: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204124.57155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.57167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.57191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.57234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.57274: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204124.57290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.57309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204124.57340: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204124.57347: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204124.57357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.57373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.57405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.57418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.57425: stderr chunk (state=3): >>>debug2: 
match found <<< 13830 1727204124.57435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.57566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.57629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.57649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.57720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.59246: stdout chunk (state=3): >>>/root <<< 13830 1727204124.59427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.59430: stdout chunk (state=3): >>><<< 13830 1727204124.59441: stderr chunk (state=3): >>><<< 13830 1727204124.59469: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204124.59479: _low_level_execute_command(): starting 13830 1727204124.59485: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538 `" && echo ansible-tmp-1727204124.5946882-17990-152828505271538="` echo /root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538 `" ) && sleep 0' 13830 1727204124.61397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204124.61433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.61446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.61532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.61577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.61584: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204124.61594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.61606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204124.61614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204124.61621: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 13830 1727204124.61630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.61652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.61663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.61672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.61679: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204124.61707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.61895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.62007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.62010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.62014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.63797: stdout chunk (state=3): >>>ansible-tmp-1727204124.5946882-17990-152828505271538=/root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538 <<< 13830 1727204124.63979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.63983: stdout chunk (state=3): >>><<< 13830 1727204124.63989: stderr chunk (state=3): >>><<< 13830 1727204124.64020: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204124.5946882-17990-152828505271538=/root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204124.64048: variable 'ansible_module_compression' from source: unknown 13830 1727204124.64093: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204124.64114: variable 'ansible_facts' from source: unknown 13830 1727204124.64184: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538/AnsiballZ_command.py 13830 1727204124.64624: Sending initial data 13830 1727204124.64628: Sent initial data (156 bytes) 13830 1727204124.68804: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.68808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.69010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.69014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.69033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204124.69041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.69109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.69282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.69286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.69355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.71069: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204124.71112: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204124.71161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmphe2gwar3 /root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538/AnsiballZ_command.py <<< 13830 1727204124.71197: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204124.72507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.72683: stderr chunk (state=3): >>><<< 13830 1727204124.72686: stdout chunk (state=3): >>><<< 13830 1727204124.72689: done transferring module to remote 13830 1727204124.72691: _low_level_execute_command(): starting 13830 1727204124.72693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538/ /root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538/AnsiballZ_command.py && sleep 0' 13830 1727204124.74054: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.74058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
13830 1727204124.74098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.74101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.74104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.74189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.74192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.74195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.74342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.75942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.76027: stderr chunk (state=3): >>><<< 13830 1727204124.76032: stdout chunk (state=3): >>><<< 13830 1727204124.76130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204124.76136: _low_level_execute_command(): starting 13830 1727204124.76139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538/AnsiballZ_command.py && sleep 0' 13830 1727204124.78207: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.78328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.78335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.78370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204124.78385: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.78392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.78558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.78568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.78583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.78665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.92627: stdout chunk (state=3): >>> {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-24 14:55:24.922460", "end": "2024-09-24 14:55:24.925362", "delta": "0:00:00.002902", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204124.93827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204124.93831: stdout chunk (state=3): >>><<< 13830 1727204124.93836: stderr chunk (state=3): >>><<< 13830 1727204124.93858: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-24 14:55:24.922460", "end": "2024-09-24 14:55:24.925362", "delta": "0:00:00.002902", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204124.93895: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204124.93902: _low_level_execute_command(): starting 13830 1727204124.93905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204124.5946882-17990-152828505271538/ > /dev/null 2>&1 && sleep 0' 13830 1727204124.95229: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204124.96085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.96096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.96109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.96151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.96159: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204124.96171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.96185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204124.96193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204124.96204: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204124.96207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204124.96215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204124.96226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204124.96236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204124.96239: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204124.96249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204124.96320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204124.96338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204124.96350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204124.96422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204124.98269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204124.98285: stdout chunk (state=3): >>><<< 
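Each AnsiballZ run prints exactly one JSON document on stdout, which is what the "_low_level_execute_command() done: rc=0, stdout= {...}" entry above captures: for this loop item the command was cat /sys/class/net/nm-bond/bonding/arp_interval and the payload reports "stdout": "60" with "rc": 0 plus timing fields. A minimal parsing sketch, using the JSON abridged from that log entry as sample input; the field names come from the log, nothing else is assumed.

    import json

    # Abridged from the log entry above (arp_interval check on managed-node3).
    raw = '''{"changed": true, "stdout": "60", "stderr": "", "rc": 0,
              "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"],
              "start": "2024-09-24 14:55:24.922460", "end": "2024-09-24 14:55:24.925362",
              "delta": "0:00:00.002902", "msg": ""}'''

    result = json.loads(raw)
    assert result["rc"] == 0
    # The task's condition in the log is `bond_opt.value in result.stdout`,
    # i.e. a substring check against the captured stdout.
    expected = "60"   # value of the current bond_opt loop item
    print("match:", expected in result["stdout"])   # -> match: True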
13830 1727204124.98290: stderr chunk (state=3): >>><<< 13830 1727204124.98295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204124.98303: handler run complete 13830 1727204124.98326: Evaluated conditional (False): False 13830 1727204124.98479: variable 'bond_opt' from source: unknown 13830 1727204124.98484: variable 'result' from source: set_fact 13830 1727204124.98499: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204124.98510: attempt loop complete, returning result 13830 1727204124.98531: variable 'bond_opt' from source: unknown 13830 1727204124.98598: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'arp_interval', 'value': '60'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_interval", "value": "60" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_interval" ], "delta": "0:00:00.002902", "end": "2024-09-24 14:55:24.925362", "rc": 0, "start": "2024-09-24 14:55:24.922460" } STDOUT: 60 13830 1727204124.98744: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204124.98748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204124.98750: variable 'omit' from source: magic vars 13830 1727204124.99011: variable 'ansible_distribution_major_version' from source: facts 13830 1727204124.99015: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204124.99021: variable 'omit' from source: magic vars 13830 1727204124.99037: variable 'omit' from source: magic vars 13830 1727204124.99311: variable 'controller_device' from source: play vars 13830 1727204124.99440: variable 'bond_opt' from source: unknown 13830 1727204124.99459: variable 'omit' from source: magic vars 13830 1727204124.99482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204124.99490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204124.99497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204124.99511: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204124.99514: 
variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204124.99517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204124.99716: Set connection var ansible_connection to ssh 13830 1727204124.99723: Set connection var ansible_timeout to 10 13830 1727204124.99729: Set connection var ansible_shell_executable to /bin/sh 13830 1727204124.99731: Set connection var ansible_shell_type to sh 13830 1727204124.99737: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204124.99747: Set connection var ansible_pipelining to False 13830 1727204124.99940: variable 'ansible_shell_executable' from source: unknown 13830 1727204124.99943: variable 'ansible_connection' from source: unknown 13830 1727204124.99946: variable 'ansible_module_compression' from source: unknown 13830 1727204124.99949: variable 'ansible_shell_type' from source: unknown 13830 1727204124.99951: variable 'ansible_shell_executable' from source: unknown 13830 1727204124.99953: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204124.99957: variable 'ansible_pipelining' from source: unknown 13830 1727204124.99959: variable 'ansible_timeout' from source: unknown 13830 1727204124.99964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204125.00291: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204125.00319: variable 'omit' from source: magic vars 13830 1727204125.00322: starting attempt loop 13830 1727204125.00324: running the handler 13830 1727204125.00331: _low_level_execute_command(): starting 13830 1727204125.00336: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204125.01270: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.01274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.01276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.01279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.01281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.01283: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.01285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.01287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.01289: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.01292: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.01420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.01423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.01426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.01428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
<<< 13830 1727204125.01430: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.01435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.01438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.01440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.01442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.01499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.03197: stdout chunk (state=3): >>>/root <<< 13830 1727204125.03200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.03203: stdout chunk (state=3): >>><<< 13830 1727204125.03205: stderr chunk (state=3): >>><<< 13830 1727204125.03225: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.03237: _low_level_execute_command(): starting 13830 1727204125.03240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161 `" && echo ansible-tmp-1727204125.0322478-17990-116739612725161="` echo /root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161 `" ) && sleep 0' 13830 1727204125.04538: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.04890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.04894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.04939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.04946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.04961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.04965: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.04982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.04987: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.05091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.05280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.05332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.07155: stdout chunk (state=3): >>>ansible-tmp-1727204125.0322478-17990-116739612725161=/root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161 <<< 13830 1727204125.07328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.07357: stderr chunk (state=3): >>><<< 13830 1727204125.07360: stdout chunk (state=3): >>><<< 13830 1727204125.07607: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204125.0322478-17990-116739612725161=/root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.07611: variable 'ansible_module_compression' from source: unknown 13830 1727204125.07614: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204125.07616: variable 'ansible_facts' from source: unknown 13830 1727204125.07618: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161/AnsiballZ_command.py 13830 1727204125.07693: Sending initial data 13830 1727204125.07696: Sent initial data (156 bytes) 13830 1727204125.08706: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.08722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.08740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.08759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.08806: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.08820: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.08838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.08856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.08872: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.08884: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.08896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.08912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.08931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.08947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.08960: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.08977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.09179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.09203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.09222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.09304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.11231: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204125.11299: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204125.11334: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp66x8xfcy /root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161/AnsiballZ_command.py <<< 13830 1727204125.11349: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204125.12848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.13068: stderr chunk (state=3): >>><<< 13830 1727204125.13078: stdout chunk (state=3): >>><<< 13830 1727204125.13138: done transferring module to remote 13830 1727204125.13141: _low_level_execute_command(): starting 13830 1727204125.13180: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161/ /root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161/AnsiballZ_command.py && sleep 0' 13830 1727204125.14766: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
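Every _low_level_execute_command() and sftp call in this excerpt reattaches to an already established SSH master ("auto-mux: Trying existing master ... mux_client_request_session: master session id: 2"), so only the client-side config parsing and the mux handshake appear, never a fresh key exchange. A minimal sketch of that reuse pattern with plain ssh multiplexing options; the exact flags Ansible passed for this run are not shown in the excerpt, so the option values and ControlPath below are illustrative assumptions.

    import subprocess

    HOST = "root@10.31.15.87"
    # Illustrative multiplexing options; the "auto-mux" / "mux_client_*" lines in the
    # log imply ControlMaster reuse, but the actual flags are not visible here.
    MUX = ["-o", "ControlMaster=auto",
           "-o", "ControlPersist=60s",
           "-o", "ControlPath=/tmp/ssh-mux-%r@%h:%p"]

    def run(cmd: str) -> str:
        # Reuses the master connection if one is already up, otherwise starts it;
        # later calls skip key exchange and authentication entirely.
        out = subprocess.run(["ssh", *MUX, HOST, cmd],
                             capture_output=True, text=True, check=True)
        return out.stdout

    print(run("echo ~"))   # the same probe the log issues before creating a tmp dir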
13830 1727204125.14775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.14846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.14861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204125.14866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.15050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.15106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.16768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.16854: stderr chunk (state=3): >>><<< 13830 1727204125.16857: stdout chunk (state=3): >>><<< 13830 1727204125.16884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.16887: _low_level_execute_command(): starting 13830 1727204125.16892: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161/AnsiballZ_command.py && sleep 0' 13830 1727204125.17911: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.17915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.17917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.17919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.17922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.17924: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.17926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.17927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.17930: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.17932: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.17933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.17935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.17937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.17939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.17941: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.17943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.17945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.17965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.17974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.18316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.31709: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-24 14:55:25.313017", "end": "2024-09-24 14:55:25.315945", "delta": "0:00:00.002928", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204125.32874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204125.32937: stderr chunk (state=3): >>><<< 13830 1727204125.32957: stdout chunk (state=3): >>><<< 13830 1727204125.32970: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-24 14:55:25.313017", "end": "2024-09-24 14:55:25.315945", "delta": "0:00:00.002928", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
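At this point two loop items have been verified: arp_interval read back 60 and arp_ip_target read back 192.0.2.128, each via cat /sys/class/net/nm-bond/bonding/<option> followed by "Evaluated conditional (bond_opt.value in result.stdout): True". The same check can be expressed directly against sysfs; the sketch below assumes it runs on the managed node itself and that the bond device is named nm-bond as in the log.

    from pathlib import Path

    BOND = "nm-bond"                       # controller_device in the play
    # Expected values taken from the loop items visible in the log.
    EXPECTED = {
        "arp_interval": "60",
        "arp_ip_target": "192.0.2.128",
    }

    for option, expected in EXPECTED.items():
        actual = Path(f"/sys/class/net/{BOND}/bonding/{option}").read_text().strip()
        # Mirrors the task's condition `bond_opt.value in result.stdout`:
        # a substring match, which also copes with options that report extra tokens.
        ok = expected in actual
        print(f"{option}: expected {expected!r}, got {actual!r}, ok={ok}")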
13830 1727204125.32998: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_ip_target', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204125.33005: _low_level_execute_command(): starting 13830 1727204125.33007: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204125.0322478-17990-116739612725161/ > /dev/null 2>&1 && sleep 0' 13830 1727204125.34282: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.34325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.34357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.34413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.34539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.34552: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.34570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.34602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.34635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.34652: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.34667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.34681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.34695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.34725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.34789: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.34820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.34928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.34945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.34962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.35028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.36871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.36902: stderr chunk (state=3): >>><<< 13830 1727204125.36906: stdout chunk (state=3): >>><<< 13830 1727204125.36970: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.36985: handler run complete 13830 1727204125.37077: Evaluated conditional (False): False 13830 1727204125.37229: variable 'bond_opt' from source: unknown 13830 1727204125.37257: variable 'result' from source: set_fact 13830 1727204125.37278: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204125.37302: attempt loop complete, returning result 13830 1727204125.37342: variable 'bond_opt' from source: unknown 13830 1727204125.37491: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'arp_ip_target', 'value': '192.0.2.128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_ip_target", "value": "192.0.2.128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_ip_target" ], "delta": "0:00:00.002928", "end": "2024-09-24 14:55:25.315945", "rc": 0, "start": "2024-09-24 14:55:25.313017" } STDOUT: 192.0.2.128 13830 1727204125.37821: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204125.37838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204125.37869: variable 'omit' from source: magic vars 13830 1727204125.38135: variable 'ansible_distribution_major_version' from source: facts 13830 1727204125.38182: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204125.38190: variable 'omit' from source: magic vars 13830 1727204125.38213: variable 'omit' from source: magic vars 13830 1727204125.38485: variable 'controller_device' from source: play vars 13830 1727204125.38494: variable 'bond_opt' from source: unknown 13830 1727204125.38517: variable 'omit' from source: magic vars 13830 1727204125.38550: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204125.38566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204125.38580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204125.38603: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204125.38611: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204125.38619: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204125.38712: Set connection var ansible_connection to ssh 13830 1727204125.38727: Set connection var ansible_timeout to 10 13830 1727204125.38741: Set connection var ansible_shell_executable to /bin/sh 13830 1727204125.38754: Set connection var ansible_shell_type to sh 13830 1727204125.38768: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204125.38782: Set connection var ansible_pipelining to False 13830 1727204125.38811: variable 'ansible_shell_executable' from source: unknown 13830 1727204125.38820: variable 'ansible_connection' from source: unknown 13830 1727204125.38827: variable 'ansible_module_compression' from source: unknown 13830 1727204125.38836: variable 'ansible_shell_type' from source: unknown 13830 1727204125.38842: variable 'ansible_shell_executable' from source: unknown 13830 1727204125.38847: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204125.38858: variable 'ansible_pipelining' from source: unknown 13830 1727204125.38867: variable 'ansible_timeout' from source: unknown 13830 1727204125.38875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204125.39028: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204125.39031: variable 'omit' from source: magic vars 13830 1727204125.39034: starting attempt loop 13830 1727204125.39036: running the handler 13830 1727204125.39038: _low_level_execute_command(): starting 13830 1727204125.39045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204125.39521: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.39534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.39552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204125.39636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204125.39650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.39663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.39720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.41277: stdout chunk (state=3): >>>/root <<< 13830 1727204125.41379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.41593: stderr chunk 
(state=3): >>><<< 13830 1727204125.41615: stdout chunk (state=3): >>><<< 13830 1727204125.41648: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.41665: _low_level_execute_command(): starting 13830 1727204125.41676: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295 `" && echo ansible-tmp-1727204125.4165285-17990-257362943983295="` echo /root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295 `" ) && sleep 0' 13830 1727204125.42576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.42601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.42624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.42646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.42689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.42727: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.42752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.42772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.42786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.42798: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.42812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.42830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.42860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.42876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.42889: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.42947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.43119: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.43142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.43185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.43404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.45200: stdout chunk (state=3): >>>ansible-tmp-1727204125.4165285-17990-257362943983295=/root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295 <<< 13830 1727204125.45310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.45404: stderr chunk (state=3): >>><<< 13830 1727204125.45417: stdout chunk (state=3): >>><<< 13830 1727204125.45576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204125.4165285-17990-257362943983295=/root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.45580: variable 'ansible_module_compression' from source: unknown 13830 1727204125.45582: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204125.45584: variable 'ansible_facts' from source: unknown 13830 1727204125.45612: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295/AnsiballZ_command.py 13830 1727204125.45750: Sending initial data 13830 1727204125.45754: Sent initial data (156 bytes) 13830 1727204125.46697: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.46705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.46754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.46757: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.46760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.46762: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.46766: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.46814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.46817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.46827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.46885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.49987: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13830 1727204125.50002: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13830 1727204125.50044: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13830 1727204125.50048: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 13830 1727204125.50080: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 13830 1727204125.50101: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 13830 1727204125.50115: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 13830 1727204125.50142: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 13830 1727204125.50179: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204125.50241: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 13830 1727204125.50250: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 13830 1727204125.50263: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 13830 1727204125.50321: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpoxfiecak /root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295/AnsiballZ_command.py <<< 13830 1727204125.50414: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204125.51773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.51879: stderr chunk (state=3): >>><<< 13830 1727204125.51887: stdout chunk (state=3): >>><<< 13830 1727204125.51912: done transferring module to remote 13830 1727204125.51915: _low_level_execute_command(): starting 13830 1727204125.51920: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295/ /root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295/AnsiballZ_command.py && sleep 0' 13830 1727204125.52397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.52406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.52414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.52424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.52455: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.52461: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.52471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.52485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.52489: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.52498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.52507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.52512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.52563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.52588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.52647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.54332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.54392: stderr chunk (state=3): >>><<< 13830 1727204125.54395: stdout chunk (state=3): >>><<< 13830 1727204125.54410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.54414: _low_level_execute_command(): starting 13830 1727204125.54425: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295/AnsiballZ_command.py && sleep 0' 13830 1727204125.55024: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.55040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.55056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.55089: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.55133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.55146: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.55167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.55193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.55206: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.55217: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.55229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.55243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.55259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.55272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.55288: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.55306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.55383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.55404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.55423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.55509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.69281: stdout chunk (state=3): >>> {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-24 14:55:25.688912", "end": "2024-09-24 14:55:25.691806", "delta": "0:00:00.002894", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204125.70536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204125.70541: stdout chunk (state=3): >>><<< 13830 1727204125.70543: stderr chunk (state=3): >>><<< 13830 1727204125.70689: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-24 14:55:25.688912", "end": "2024-09-24 14:55:25.691806", "delta": "0:00:00.002894", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204125.70693: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_validate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204125.70695: _low_level_execute_command(): starting 13830 1727204125.70697: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204125.4165285-17990-257362943983295/ > /dev/null 2>&1 && sleep 0' 13830 1727204125.71596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.71610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.71625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.71645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.71700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.71712: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.71725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.71745: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.71756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.71772: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.71788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.71801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.71816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.71829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.71843: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.71856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.71945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.71971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.71995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.72072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.73917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.73920: stdout chunk (state=3): >>><<< 13830 1727204125.73923: stderr chunk (state=3): >>><<< 13830 1727204125.73969: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.73977: handler run complete 13830 1727204125.74076: Evaluated conditional (False): False 13830 1727204125.74162: variable 'bond_opt' from source: unknown 13830 1727204125.74176: variable 'result' from source: set_fact 13830 1727204125.74200: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204125.74216: attempt loop complete, returning result 13830 1727204125.74244: variable 'bond_opt' from source: unknown 13830 1727204125.74337: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'arp_validate', 'value': 'none'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_validate", "value": "none" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_validate" ], "delta": "0:00:00.002894", "end": "2024-09-24 14:55:25.691806", "rc": 0, "start": "2024-09-24 14:55:25.688912" } STDOUT: none 0 13830 1727204125.74594: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204125.74605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204125.74617: variable 'omit' from source: magic vars 13830 1727204125.74770: variable 'ansible_distribution_major_version' from source: facts 13830 1727204125.74780: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204125.74788: variable 'omit' from source: magic vars 13830 1727204125.74818: variable 'omit' from source: magic vars 13830 1727204125.75024: variable 'controller_device' from source: play vars 13830 1727204125.75037: variable 'bond_opt' from source: unknown 13830 1727204125.75068: variable 'omit' from source: magic vars 13830 1727204125.75095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204125.75109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204125.75122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204125.75177: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204125.75185: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204125.75193: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204125.75318: Set connection var ansible_connection to ssh 13830 1727204125.75335: Set connection var ansible_timeout to 10 13830 1727204125.75355: Set connection var ansible_shell_executable to /bin/sh 13830 1727204125.75362: Set connection var ansible_shell_type to sh 13830 1727204125.75375: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204125.75392: Set connection var ansible_pipelining to False 13830 1727204125.75438: variable 'ansible_shell_executable' from source: unknown 13830 1727204125.75447: variable 'ansible_connection' from source: unknown 13830 1727204125.75461: variable 'ansible_module_compression' from source: unknown 13830 1727204125.75470: variable 'ansible_shell_type' from source: unknown 13830 1727204125.75478: variable 'ansible_shell_executable' from source: unknown 13830 1727204125.75484: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204125.75493: variable 'ansible_pipelining' from source: unknown 13830 1727204125.75502: variable 'ansible_timeout' from source: unknown 13830 1727204125.75509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204125.75628: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204125.75645: variable 'omit' from source: magic vars 13830 1727204125.75654: starting attempt loop 13830 1727204125.75661: running the handler 13830 1727204125.75679: _low_level_execute_command(): starting 13830 1727204125.75690: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204125.77041: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.77057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.77079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.77110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.77197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.77217: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.77231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.77251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.77270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.77285: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.77301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.77327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.77350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.77363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.77382: stderr chunk (state=3): >>>debug2: match found <<< 13830 
1727204125.77401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.77606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.77746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.77770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.77848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.79377: stdout chunk (state=3): >>>/root <<< 13830 1727204125.79490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.79537: stderr chunk (state=3): >>><<< 13830 1727204125.79541: stdout chunk (state=3): >>><<< 13830 1727204125.79553: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.79562: _low_level_execute_command(): starting 13830 1727204125.79571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647 `" && echo ansible-tmp-1727204125.7955394-17990-184094815217647="` echo /root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647 `" ) && sleep 0' 13830 1727204125.80002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.80007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.80057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.80061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204125.80077: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.80128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.80137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.80231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.82024: stdout chunk (state=3): >>>ansible-tmp-1727204125.7955394-17990-184094815217647=/root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647 <<< 13830 1727204125.82144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.82196: stderr chunk (state=3): >>><<< 13830 1727204125.82200: stdout chunk (state=3): >>><<< 13830 1727204125.82236: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204125.7955394-17990-184094815217647=/root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.82240: variable 'ansible_module_compression' from source: unknown 13830 1727204125.82268: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204125.82341: variable 'ansible_facts' from source: unknown 13830 1727204125.82460: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647/AnsiballZ_command.py 13830 1727204125.82971: Sending initial data 13830 1727204125.82977: Sent initial data (156 bytes) 13830 1727204125.84918: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.84926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.84942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.84960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.85222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.85226: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.85228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.85230: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.85232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.85234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.85236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.85238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.85240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.85242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.85244: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.85246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.85272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.85275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.85413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.85416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.87075: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204125.87080: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204125.87115: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmptfjdiqm9 /root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647/AnsiballZ_command.py <<< 13830 1727204125.87150: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204125.88280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.88284: stderr chunk (state=3): >>><<< 13830 1727204125.88286: stdout chunk (state=3): >>><<< 13830 1727204125.88288: done transferring module to remote 13830 1727204125.88292: _low_level_execute_command(): starting 13830 1727204125.88294: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647/ /root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647/AnsiballZ_command.py && sleep 0' 13830 1727204125.88916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.88945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.88960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.88982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 
1727204125.89110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.89122: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.89139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.89157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.89172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.89195: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.89209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.89225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.89246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.89259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.89276: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.89293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.89381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.89406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.89436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.89552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204125.91305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204125.91308: stdout chunk (state=3): >>><<< 13830 1727204125.91312: stderr chunk (state=3): >>><<< 13830 1727204125.91413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204125.91417: _low_level_execute_command(): starting 13830 1727204125.91422: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647/AnsiballZ_command.py && sleep 0' 13830 1727204125.92400: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204125.92409: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.92421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.92437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.92473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.92480: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204125.92490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.92506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204125.92509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204125.92515: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204125.92523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204125.92532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204125.92543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204125.92579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204125.92586: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204125.92597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204125.92667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204125.92885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204125.92895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204125.93140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204126.06722: stdout chunk (state=3): >>> {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-24 14:55:26.063395", "end": "2024-09-24 14:55:26.066312", "delta": "0:00:00.002917", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204126.07872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204126.07924: stderr chunk (state=3): >>><<< 13830 1727204126.07928: stdout chunk (state=3): >>><<< 13830 1727204126.07945: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-24 14:55:26.063395", "end": "2024-09-24 14:55:26.066312", "delta": "0:00:00.002917", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
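The two loop iterations above (arp_validate, then primary) show the full per-item cycle for the "** TEST check bond settings" task: create a remote temp dir, sftp the AnsiballZ_command.py wrapper across, chmod it, run it with /usr/bin/python3.9, read the bonding sysfs attribute, then remove the temp dir. A minimal sketch of a task that would drive this loop is below; the command, loop_var, conditional and the two loop items are taken from the log output, while the surrounding file layout in assert_bond_options.yml and the reduced two-item loop list are assumptions made for illustration only.

# Hypothetical reconstruction of the "** TEST check bond settings" task.
# Only the items visible in this excerpt are listed; the real test presumably
# iterates over the full set of configured bond options.
- name: "** TEST check bond settings"
  command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  until: bond_opt.value in result.stdout
  loop:
    - { key: arp_validate, value: "none" }
    - { key: primary, value: test1 }
  loop_control:
    loop_var: bond_opt

Using a membership test (bond_opt.value in result.stdout) rather than string equality keeps the check tolerant of how the bonding driver formats sysfs values; arp_validate, for example, is reported as "none 0" (option name plus its numeric value), which is exactly the STDOUT shown above.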
13830 1727204126.07968: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/primary', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204126.07974: _low_level_execute_command(): starting 13830 1727204126.07978: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204125.7955394-17990-184094815217647/ > /dev/null 2>&1 && sleep 0' 13830 1727204126.08447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.08451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.08483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.08487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.08497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.08502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.08509: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204126.08516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.08579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204126.08586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204126.08594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204126.08653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204126.11253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204126.11257: stdout chunk (state=3): >>><<< 13830 1727204126.11262: stderr chunk (state=3): >>><<< 13830 1727204126.11281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204126.11287: handler run complete 13830 1727204126.11309: Evaluated conditional (False): False 13830 1727204126.11473: variable 'bond_opt' from source: unknown 13830 1727204126.11478: variable 'result' from source: set_fact 13830 1727204126.11494: Evaluated conditional (bond_opt.value in result.stdout): True 13830 1727204126.11505: attempt loop complete, returning result 13830 1727204126.11523: variable 'bond_opt' from source: unknown 13830 1727204126.11592: variable 'bond_opt' from source: unknown ok: [managed-node3] => (item={'key': 'primary', 'value': 'test1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "primary", "value": "test1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/primary" ], "delta": "0:00:00.002917", "end": "2024-09-24 14:55:26.066312", "rc": 0, "start": "2024-09-24 14:55:26.063395" } STDOUT: test1 13830 1727204126.11747: dumping result to json 13830 1727204126.11750: done dumping result, returning 13830 1727204126.11752: done running TaskExecutor() for managed-node3/TASK: ** TEST check bond settings [0affcd87-79f5-1659-6b02-000000000c2a] 13830 1727204126.11755: sending task result for task 0affcd87-79f5-1659-6b02-000000000c2a 13830 1727204126.11811: done sending task result for task 0affcd87-79f5-1659-6b02-000000000c2a 13830 1727204126.11814: WORKER PROCESS EXITING 13830 1727204126.11958: no more pending results, returning what we have 13830 1727204126.11962: results queue empty 13830 1727204126.11963: checking for any_errors_fatal 13830 1727204126.11966: done checking for any_errors_fatal 13830 1727204126.11967: checking for max_fail_percentage 13830 1727204126.11968: done checking for max_fail_percentage 13830 1727204126.11969: checking to see if all hosts have failed and the running result is not ok 13830 1727204126.11970: done checking to see if all hosts have failed 13830 1727204126.11971: getting the remaining hosts for this loop 13830 1727204126.11972: done getting the remaining hosts for this loop 13830 1727204126.11976: getting the next task for host managed-node3 13830 1727204126.11981: done getting next task for host managed-node3 13830 1727204126.11983: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 13830 1727204126.11987: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204126.11990: getting variables 13830 1727204126.11991: in VariableManager get_vars() 13830 1727204126.12031: Calling all_inventory to load vars for managed-node3 13830 1727204126.12034: Calling groups_inventory to load vars for managed-node3 13830 1727204126.12036: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204126.12046: Calling all_plugins_play to load vars for managed-node3 13830 1727204126.12048: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204126.12051: Calling groups_plugins_play to load vars for managed-node3 13830 1727204126.15079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204126.21574: done with get_vars() 13830 1727204126.21616: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Tuesday 24 September 2024 14:55:26 -0400 (0:00:02.206) 0:00:59.297 ***** 13830 1727204126.21947: entering _queue_task() for managed-node3/include_tasks 13830 1727204126.23216: worker is 1 (out of 1 available) 13830 1727204126.23230: exiting _queue_task() for managed-node3/include_tasks 13830 1727204126.23242: done queuing things up, now waiting for results queue to drain 13830 1727204126.23244: waiting for pending results... 13830 1727204126.23763: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_IPv4_present.yml' 13830 1727204126.23984: in run() - task 0affcd87-79f5-1659-6b02-000000000c2c 13830 1727204126.24091: variable 'ansible_search_path' from source: unknown 13830 1727204126.24278: variable 'ansible_search_path' from source: unknown 13830 1727204126.24320: calling self._execute() 13830 1727204126.24445: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204126.24458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204126.24482: variable 'omit' from source: magic vars 13830 1727204126.25378: variable 'ansible_distribution_major_version' from source: facts 13830 1727204126.25400: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204126.25412: _execute() done 13830 1727204126.25421: dumping result to json 13830 1727204126.25429: done dumping result, returning 13830 1727204126.25445: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_IPv4_present.yml' [0affcd87-79f5-1659-6b02-000000000c2c] 13830 1727204126.25456: sending task result for task 0affcd87-79f5-1659-6b02-000000000c2c 13830 1727204126.25609: no more pending results, returning what we have 13830 1727204126.25617: in VariableManager get_vars() 13830 1727204126.25681: Calling all_inventory to load vars for managed-node3 13830 1727204126.25687: Calling groups_inventory to load vars for managed-node3 13830 1727204126.25689: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204126.25705: Calling all_plugins_play to load vars for managed-node3 13830 1727204126.25708: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204126.25711: Calling groups_plugins_play to 
load vars for managed-node3 13830 1727204126.26230: done sending task result for task 0affcd87-79f5-1659-6b02-000000000c2c 13830 1727204126.26236: WORKER PROCESS EXITING 13830 1727204126.29063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204126.33738: done with get_vars() 13830 1727204126.33776: variable 'ansible_search_path' from source: unknown 13830 1727204126.33778: variable 'ansible_search_path' from source: unknown 13830 1727204126.33789: variable 'item' from source: include params 13830 1727204126.33907: variable 'item' from source: include params 13830 1727204126.33948: we have included files to process 13830 1727204126.33950: generating all_blocks data 13830 1727204126.33952: done generating all_blocks data 13830 1727204126.33958: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 13830 1727204126.33959: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 13830 1727204126.33962: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 13830 1727204126.34372: done processing included file 13830 1727204126.34375: iterating over new_blocks loaded from include file 13830 1727204126.34376: in VariableManager get_vars() 13830 1727204126.34403: done with get_vars() 13830 1727204126.34405: filtering new block on tags 13830 1727204126.34438: done filtering new block on tags 13830 1727204126.34441: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed-node3 13830 1727204126.34446: extending task lists for all hosts with included blocks 13830 1727204126.34889: done extending task lists 13830 1727204126.34890: done processing included files 13830 1727204126.34891: results queue empty 13830 1727204126.34892: checking for any_errors_fatal 13830 1727204126.34901: done checking for any_errors_fatal 13830 1727204126.34902: checking for max_fail_percentage 13830 1727204126.34903: done checking for max_fail_percentage 13830 1727204126.34904: checking to see if all hosts have failed and the running result is not ok 13830 1727204126.34904: done checking to see if all hosts have failed 13830 1727204126.34905: getting the remaining hosts for this loop 13830 1727204126.34906: done getting the remaining hosts for this loop 13830 1727204126.34909: getting the next task for host managed-node3 13830 1727204126.34913: done getting next task for host managed-node3 13830 1727204126.34916: ^ task is: TASK: ** TEST check IPv4 13830 1727204126.34919: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204126.34922: getting variables 13830 1727204126.34923: in VariableManager get_vars() 13830 1727204126.34940: Calling all_inventory to load vars for managed-node3 13830 1727204126.34942: Calling groups_inventory to load vars for managed-node3 13830 1727204126.34944: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204126.34950: Calling all_plugins_play to load vars for managed-node3 13830 1727204126.34952: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204126.34955: Calling groups_plugins_play to load vars for managed-node3 13830 1727204126.36997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204126.39516: done with get_vars() 13830 1727204126.39556: done getting variables 13830 1727204126.39605: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.176) 0:00:59.474 ***** 13830 1727204126.39641: entering _queue_task() for managed-node3/command 13830 1727204126.39992: worker is 1 (out of 1 available) 13830 1727204126.40005: exiting _queue_task() for managed-node3/command 13830 1727204126.40019: done queuing things up, now waiting for results queue to drain 13830 1727204126.40021: waiting for pending results... 
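Once the bond option loop completes, the include at assert_bond_options.yml:11 pulls in assert_IPv4_present.yml, passing the interface to check as an include parameter; the log resolves 'interface' through controller_device, i.e. the nm-bond device. The first task in that file, "** TEST check IPv4" at assert_IPv4_present.yml:3, is another ansible.legacy.command invocation, but its command line is not visible in this excerpt, so the sketch below is only a plausible shape rather than the actual file contents: the ip invocation, the until condition and the retry settings are all assumptions.

# Hypothetical sketch of tasks/assert_IPv4_present.yml. Only the task name,
# its path/line (assert_IPv4_present.yml:3) and the 'interface' include
# parameter are confirmed by the log; everything else is assumed.
- name: "** TEST check IPv4"
  command: ip -4 addr show {{ interface }}
  register: result
  until: "'inet ' in result.stdout"
  retries: 20
  delay: 2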
13830 1727204126.41120: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 13830 1727204126.41891: in run() - task 0affcd87-79f5-1659-6b02-000000000da6 13830 1727204126.41922: variable 'ansible_search_path' from source: unknown 13830 1727204126.41931: variable 'ansible_search_path' from source: unknown 13830 1727204126.41983: calling self._execute() 13830 1727204126.42102: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204126.42114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204126.42131: variable 'omit' from source: magic vars 13830 1727204126.42756: variable 'ansible_distribution_major_version' from source: facts 13830 1727204126.42785: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204126.42798: variable 'omit' from source: magic vars 13830 1727204126.42861: variable 'omit' from source: magic vars 13830 1727204126.43044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204126.47438: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204126.47523: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204126.47571: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204126.47727: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204126.47762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204126.47871: variable 'interface' from source: include params 13830 1727204126.48046: variable 'controller_device' from source: play vars 13830 1727204126.48125: variable 'controller_device' from source: play vars 13830 1727204126.48168: variable 'omit' from source: magic vars 13830 1727204126.48294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204126.48327: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204126.48386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204126.48492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204126.48509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204126.48556: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204126.48591: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204126.48600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204126.48774: Set connection var ansible_connection to ssh 13830 1727204126.48896: Set connection var ansible_timeout to 10 13830 1727204126.48914: Set connection var ansible_shell_executable to /bin/sh 13830 1727204126.48921: Set connection var ansible_shell_type to sh 13830 1727204126.48930: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204126.48945: Set connection var ansible_pipelining to False 13830 1727204126.48977: variable 'ansible_shell_executable' from source: unknown 13830 1727204126.49030: variable 
'ansible_connection' from source: unknown 13830 1727204126.49041: variable 'ansible_module_compression' from source: unknown 13830 1727204126.49078: variable 'ansible_shell_type' from source: unknown 13830 1727204126.49086: variable 'ansible_shell_executable' from source: unknown 13830 1727204126.49093: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204126.49131: variable 'ansible_pipelining' from source: unknown 13830 1727204126.49141: variable 'ansible_timeout' from source: unknown 13830 1727204126.49149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204126.49347: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204126.49369: variable 'omit' from source: magic vars 13830 1727204126.49381: starting attempt loop 13830 1727204126.49395: running the handler 13830 1727204126.49420: _low_level_execute_command(): starting 13830 1727204126.49430: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204126.50228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204126.50247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.50262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.50286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.50326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.50347: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204126.50361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.50381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204126.50392: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204126.50403: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204126.50417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.50430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.50454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.50468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.50479: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204126.50492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.50569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204126.50586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204126.50600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204126.50684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204126.52615: stdout chunk (state=3): >>>/root <<< 13830 1727204126.52871: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204126.52876: stdout chunk (state=3): >>><<< 13830 1727204126.52879: stderr chunk (state=3): >>><<< 13830 1727204126.52883: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204126.52895: _low_level_execute_command(): starting 13830 1727204126.52899: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102 `" && echo ansible-tmp-1727204126.5279355-18114-39841833056102="` echo /root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102 `" ) && sleep 0' 13830 1727204126.54019: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204126.54043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.54062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.54089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.54160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.54178: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204126.54194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.54214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204126.54228: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204126.54244: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204126.54256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.54283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.54298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.54309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.54320: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204126.54335: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.54422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204126.54448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204126.54465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204126.54548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204126.56376: stdout chunk (state=3): >>>ansible-tmp-1727204126.5279355-18114-39841833056102=/root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102 <<< 13830 1727204126.56579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204126.56583: stdout chunk (state=3): >>><<< 13830 1727204126.56586: stderr chunk (state=3): >>><<< 13830 1727204126.56919: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204126.5279355-18114-39841833056102=/root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204126.56923: variable 'ansible_module_compression' from source: unknown 13830 1727204126.56926: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204126.56929: variable 'ansible_facts' from source: unknown 13830 1727204126.56932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102/AnsiballZ_command.py 13830 1727204126.56998: Sending initial data 13830 1727204126.57001: Sent initial data (155 bytes) 13830 1727204126.58140: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204126.58185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.58199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.58218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.58266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.58281: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204126.58296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 
1727204126.58315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204126.58327: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204126.58341: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204126.58353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.58368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.58383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.58394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.58403: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204126.58416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.58497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204126.58520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204126.58540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204126.58611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204126.60326: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204126.60367: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204126.60410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpsb1j9r96 /root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102/AnsiballZ_command.py <<< 13830 1727204126.60462: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204126.61691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204126.61899: stderr chunk (state=3): >>><<< 13830 1727204126.61903: stdout chunk (state=3): >>><<< 13830 1727204126.61905: done transferring module to remote 13830 1727204126.61907: _low_level_execute_command(): starting 13830 1727204126.61909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102/ /root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102/AnsiballZ_command.py && sleep 0' 13830 1727204126.63926: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204126.63961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.63981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.64016: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.64123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.64138: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204126.64157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.64181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204126.64192: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204126.64202: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204126.64212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.64224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.64241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.64252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.64267: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204126.64285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.64368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204126.64436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204126.64552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204126.64729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204126.66494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204126.66861: stderr chunk (state=3): >>><<< 13830 1727204126.66918: stdout chunk (state=3): >>><<< 13830 1727204126.66970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204126.67066: _low_level_execute_command(): starting 13830 1727204126.67071: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102/AnsiballZ_command.py && sleep 0' 13830 1727204126.69454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 <<< 13830 1727204126.69495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.69514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.69537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.69607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.69643: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204126.69657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.69680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204126.69692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204126.69705: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204126.69717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.69730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.69751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.69776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.69826: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204126.69846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.69935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204126.69962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204126.69982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204126.70085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204126.83994: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.225/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 239sec preferred_lft 239sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:26.835493", "end": "2024-09-24 14:55:26.838865", "delta": "0:00:00.003372", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204126.85189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204126.85249: stderr chunk (state=3): >>><<< 13830 1727204126.85252: stdout chunk (state=3): >>><<< 13830 1727204126.85328: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.225/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 239sec preferred_lft 239sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:26.835493", "end": "2024-09-24 14:55:26.838865", "delta": "0:00:00.003372", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
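The module result above is the raw return from the retried `ip -4 a s nm-bond` check. The task file itself is not part of this log, so the following is only a minimal sketch of a task whose trace would match what is shown here: an `until` loop on `address in result.stdout` (confirmed by the `Evaluated conditional (address in result.stdout)` entry a few entries below), `changed_when: false` consistent with the `Evaluated conditional (False)` entry, and `interface`/`address` arriving as include params. The retry count and delay are assumed values, not read from the file.

```yaml
# Sketch only -- a task shape consistent with the "** TEST check IPv4" trace.
# 'interface' and 'address' are include params in the log; retries/delay are assumed.
- name: "** TEST check IPv4"
  command: ip -4 a s {{ interface }}
  register: result
  until: address in result.stdout
  retries: 20
  delay: 2
  changed_when: false
```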
13830 1727204126.85376: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204126.85380: _low_level_execute_command(): starting 13830 1727204126.85382: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204126.5279355-18114-39841833056102/ > /dev/null 2>&1 && sleep 0' 13830 1727204126.86223: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204126.86227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.86239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.86251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.86344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.86349: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204126.86358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.86393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204126.86399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204126.86402: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204126.86442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204126.86445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204126.86456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204126.86467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204126.86472: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204126.86481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204126.86604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204126.86627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204126.86647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204126.86734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204126.88603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204126.88854: stderr chunk (state=3): >>><<< 13830 1727204126.88867: stdout chunk (state=3): >>><<< 13830 1727204126.88898: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204126.88916: handler run complete 13830 1727204126.88955: Evaluated conditional (False): False 13830 1727204126.89121: variable 'address' from source: include params 13830 1727204126.89125: variable 'result' from source: set_fact 13830 1727204126.89145: Evaluated conditional (address in result.stdout): True 13830 1727204126.89156: attempt loop complete, returning result 13830 1727204126.89159: _execute() done 13830 1727204126.89161: dumping result to json 13830 1727204126.89168: done dumping result, returning 13830 1727204126.89177: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 [0affcd87-79f5-1659-6b02-000000000da6] 13830 1727204126.89182: sending task result for task 0affcd87-79f5-1659-6b02-000000000da6 13830 1727204126.89383: done sending task result for task 0affcd87-79f5-1659-6b02-000000000da6 13830 1727204126.89387: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003372", "end": "2024-09-24 14:55:26.838865", "rc": 0, "start": "2024-09-24 14:55:26.835493" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.225/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 239sec preferred_lft 239sec 13830 1727204126.89499: no more pending results, returning what we have 13830 1727204126.89505: results queue empty 13830 1727204126.89506: checking for any_errors_fatal 13830 1727204126.89508: done checking for any_errors_fatal 13830 1727204126.89509: checking for max_fail_percentage 13830 1727204126.89512: done checking for max_fail_percentage 13830 1727204126.89513: checking to see if all hosts have failed and the running result is not ok 13830 1727204126.89514: done checking to see if all hosts have failed 13830 1727204126.89514: getting the remaining hosts for this loop 13830 1727204126.89517: done getting the remaining hosts for this loop 13830 1727204126.89522: getting the next task for host managed-node3 13830 1727204126.89533: done getting next task for host managed-node3 13830 1727204126.89538: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 13830 1727204126.89544: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204126.89549: getting variables 13830 1727204126.89551: in VariableManager get_vars() 13830 1727204126.89618: Calling all_inventory to load vars for managed-node3 13830 1727204126.89623: Calling groups_inventory to load vars for managed-node3 13830 1727204126.89629: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204126.89647: Calling all_plugins_play to load vars for managed-node3 13830 1727204126.89652: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204126.89659: Calling groups_plugins_play to load vars for managed-node3 13830 1727204126.92448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204126.95723: done with get_vars() 13830 1727204126.95771: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.562) 0:01:00.036 ***** 13830 1727204126.95887: entering _queue_task() for managed-node3/include_tasks 13830 1727204126.96285: worker is 1 (out of 1 available) 13830 1727204126.96297: exiting _queue_task() for managed-node3/include_tasks 13830 1727204126.96315: done queuing things up, now waiting for results queue to drain 13830 1727204126.96317: waiting for pending results... 
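The task just queued is the include at assert_bond_options.yml:16. Its exact wording is not in this log, and the `item` variable reported as an include param a few entries below suggests the included file name is templated rather than hard-coded; one plausible shape, given purely as a sketch, is:

```yaml
# Sketch only -- one plausible form of the include step at
# tasks/assert_bond_options.yml:16; 'item' is assumed to carry the file name.
- name: "Include the task '{{ item }}'"
  include_tasks: "{{ item }}"
```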
13830 1727204126.96689: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_IPv6_present.yml' 13830 1727204126.96837: in run() - task 0affcd87-79f5-1659-6b02-000000000c2d 13830 1727204126.96861: variable 'ansible_search_path' from source: unknown 13830 1727204126.96877: variable 'ansible_search_path' from source: unknown 13830 1727204126.96921: calling self._execute() 13830 1727204126.97039: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204126.97055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204126.97075: variable 'omit' from source: magic vars 13830 1727204126.98487: variable 'ansible_distribution_major_version' from source: facts 13830 1727204126.98584: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204126.98623: _execute() done 13830 1727204126.98730: dumping result to json 13830 1727204126.98740: done dumping result, returning 13830 1727204126.98752: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_IPv6_present.yml' [0affcd87-79f5-1659-6b02-000000000c2d] 13830 1727204126.98766: sending task result for task 0affcd87-79f5-1659-6b02-000000000c2d 13830 1727204126.98935: no more pending results, returning what we have 13830 1727204126.98941: in VariableManager get_vars() 13830 1727204126.99002: Calling all_inventory to load vars for managed-node3 13830 1727204126.99006: Calling groups_inventory to load vars for managed-node3 13830 1727204126.99008: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204126.99024: Calling all_plugins_play to load vars for managed-node3 13830 1727204126.99027: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204126.99030: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.00121: done sending task result for task 0affcd87-79f5-1659-6b02-000000000c2d 13830 1727204127.00125: WORKER PROCESS EXITING 13830 1727204127.00872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.03134: done with get_vars() 13830 1727204127.03162: variable 'ansible_search_path' from source: unknown 13830 1727204127.03166: variable 'ansible_search_path' from source: unknown 13830 1727204127.03176: variable 'item' from source: include params 13830 1727204127.03302: variable 'item' from source: include params 13830 1727204127.03337: we have included files to process 13830 1727204127.03339: generating all_blocks data 13830 1727204127.03341: done generating all_blocks data 13830 1727204127.03347: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 13830 1727204127.03348: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 13830 1727204127.03350: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 13830 1727204127.03560: done processing included file 13830 1727204127.03562: iterating over new_blocks loaded from include file 13830 1727204127.03565: in VariableManager get_vars() 13830 1727204127.03589: done with get_vars() 13830 1727204127.03590: filtering new block on tags 13830 1727204127.03628: done filtering new block on tags 13830 1727204127.03630: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed-node3 13830 1727204127.03635: extending task lists for all hosts with included blocks 13830 1727204127.03980: done extending task lists 13830 1727204127.03982: done processing included files 13830 1727204127.03982: results queue empty 13830 1727204127.03983: checking for any_errors_fatal 13830 1727204127.03988: done checking for any_errors_fatal 13830 1727204127.03988: checking for max_fail_percentage 13830 1727204127.03989: done checking for max_fail_percentage 13830 1727204127.03990: checking to see if all hosts have failed and the running result is not ok 13830 1727204127.03991: done checking to see if all hosts have failed 13830 1727204127.03992: getting the remaining hosts for this loop 13830 1727204127.03993: done getting the remaining hosts for this loop 13830 1727204127.03995: getting the next task for host managed-node3 13830 1727204127.04000: done getting next task for host managed-node3 13830 1727204127.04001: ^ task is: TASK: ** TEST check IPv6 13830 1727204127.04005: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204127.04007: getting variables 13830 1727204127.04008: in VariableManager get_vars() 13830 1727204127.04024: Calling all_inventory to load vars for managed-node3 13830 1727204127.04030: Calling groups_inventory to load vars for managed-node3 13830 1727204127.04032: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.04038: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.04040: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.04043: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.05563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.07491: done with get_vars() 13830 1727204127.07520: done getting variables 13830 1727204127.07577: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.117) 0:01:00.154 ***** 13830 1727204127.07610: entering _queue_task() for managed-node3/command 13830 1727204127.07990: worker is 1 (out of 1 available) 13830 1727204127.08002: exiting _queue_task() for managed-node3/command 13830 1727204127.08016: done queuing things up, now waiting for results queue to drain 13830 1727204127.08018: waiting for pending results... 
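The "** TEST check IPv6" task queued above resolves the same per-host connection settings as the IPv4 check before its handler runs. Written out as explicit variables, the values reported by the "Set connection var" entries in this log amount to the following; in practice most of these are Ansible defaults rather than values set in the inventory, and the file placement named in the comment is only an assumption.

```yaml
# Assumed placement: host_vars/managed-node3.yml. Values are taken from the
# "Set connection var" entries in this log; most are Ansible defaults.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_module_compression: ZIP_DEFLATED
ansible_pipelining: false
```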
13830 1727204127.08353: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 13830 1727204127.08739: in run() - task 0affcd87-79f5-1659-6b02-000000000dc7 13830 1727204127.08771: variable 'ansible_search_path' from source: unknown 13830 1727204127.08780: variable 'ansible_search_path' from source: unknown 13830 1727204127.08822: calling self._execute() 13830 1727204127.08970: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.08999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.09015: variable 'omit' from source: magic vars 13830 1727204127.09432: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.09450: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.09460: variable 'omit' from source: magic vars 13830 1727204127.09527: variable 'omit' from source: magic vars 13830 1727204127.09747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204127.12548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204127.12630: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204127.12680: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204127.12725: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204127.12754: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204127.12851: variable 'controller_device' from source: play vars 13830 1727204127.12881: variable 'omit' from source: magic vars 13830 1727204127.12919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204127.12957: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204127.12985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204127.13005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204127.13019: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204127.13062: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204127.13073: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.13080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.13186: Set connection var ansible_connection to ssh 13830 1727204127.13202: Set connection var ansible_timeout to 10 13830 1727204127.13212: Set connection var ansible_shell_executable to /bin/sh 13830 1727204127.13218: Set connection var ansible_shell_type to sh 13830 1727204127.13226: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204127.13239: Set connection var ansible_pipelining to False 13830 1727204127.13277: variable 'ansible_shell_executable' from source: unknown 13830 1727204127.13284: variable 'ansible_connection' from source: unknown 13830 1727204127.13291: variable 'ansible_module_compression' from source: unknown 13830 1727204127.13298: variable 
'ansible_shell_type' from source: unknown 13830 1727204127.13304: variable 'ansible_shell_executable' from source: unknown 13830 1727204127.13310: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.13318: variable 'ansible_pipelining' from source: unknown 13830 1727204127.13324: variable 'ansible_timeout' from source: unknown 13830 1727204127.13331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.13441: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204127.13534: variable 'omit' from source: magic vars 13830 1727204127.13544: starting attempt loop 13830 1727204127.13551: running the handler 13830 1727204127.13572: _low_level_execute_command(): starting 13830 1727204127.13583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204127.14392: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204127.14408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.14424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.14444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.14496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.14509: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204127.14524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.14545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204127.14559: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204127.14576: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204127.14596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.14611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.14627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.14641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.14653: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204127.14670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.14752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204127.14779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204127.14804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204127.14882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204127.16497: stdout chunk (state=3): >>>/root <<< 13830 1727204127.16694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204127.16697: stdout chunk (state=3): >>><<< 13830 1727204127.16699: stderr chunk 
(state=3): >>><<< 13830 1727204127.16812: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204127.16824: _low_level_execute_command(): starting 13830 1727204127.16827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671 `" && echo ansible-tmp-1727204127.1672077-18150-68560265369671="` echo /root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671 `" ) && sleep 0' 13830 1727204127.17381: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204127.17397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.17412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.17431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.17477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.17489: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204127.17504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.17521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204127.17537: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204127.17549: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204127.17562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.17583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.17600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.17614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.17626: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204127.17641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.17717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
<<< 13830 1727204127.17739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204127.17756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204127.17832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204127.20266: stdout chunk (state=3): >>>ansible-tmp-1727204127.1672077-18150-68560265369671=/root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671 <<< 13830 1727204127.20384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204127.20486: stderr chunk (state=3): >>><<< 13830 1727204127.20489: stdout chunk (state=3): >>><<< 13830 1727204127.20838: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204127.1672077-18150-68560265369671=/root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204127.20843: variable 'ansible_module_compression' from source: unknown 13830 1727204127.20846: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204127.20849: variable 'ansible_facts' from source: unknown 13830 1727204127.20852: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671/AnsiballZ_command.py 13830 1727204127.20917: Sending initial data 13830 1727204127.20921: Sent initial data (155 bytes) 13830 1727204127.21934: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204127.21949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.21966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.21986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.22031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.22050: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204127.22067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.22087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204127.22100: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204127.22111: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204127.22124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.22137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.22158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.22173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.22185: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204127.22198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.22281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204127.22298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204127.22313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204127.22501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204127.24132: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204127.24171: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204127.24219: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpckjdl2v0 /root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671/AnsiballZ_command.py <<< 13830 1727204127.24254: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204127.25397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204127.25600: stderr chunk (state=3): >>><<< 13830 1727204127.25604: stdout chunk (state=3): >>><<< 13830 1727204127.25606: done transferring module to remote 13830 1727204127.25608: _low_level_execute_command(): starting 13830 1727204127.25611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671/ /root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671/AnsiballZ_command.py && sleep 0' 13830 1727204127.26247: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204127.26269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.26285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.26304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.26348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 
1727204127.26371: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204127.26388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.26407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204127.26421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204127.26434: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204127.26447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.26461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.26488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.26500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.26509: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204127.26521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.26601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204127.26622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204127.26638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204127.26712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204127.28463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204127.28468: stdout chunk (state=3): >>><<< 13830 1727204127.28471: stderr chunk (state=3): >>><<< 13830 1727204127.28561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204127.28568: _low_level_execute_command(): starting 13830 1727204127.28571: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671/AnsiballZ_command.py && sleep 0' 13830 1727204127.29195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204127.29209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.29224: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.29241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.29285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.29296: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204127.29310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.29326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204127.29338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204127.29350: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204127.29363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.29380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.29396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.29409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.29966: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204127.29983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.30058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204127.30078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204127.30093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204127.30172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204127.43873: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1aa/128 scope global dynamic noprefixroute \n valid_lft 239sec preferred_lft 239sec\n inet6 2001:db8::2f:2d03:83f0:a643/64 scope global dynamic noprefixroute \n valid_lft 1797sec preferred_lft 1797sec\n inet6 fe80::866e:64d9:d944:6bc0/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:27.434263", "end": "2024-09-24 14:55:27.437735", "delta": "0:00:00.003472", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204127.45128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204127.45133: stdout chunk (state=3): >>><<< 13830 1727204127.45135: stderr chunk (state=3): >>><<< 13830 1727204127.45277: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1aa/128 scope global dynamic noprefixroute \n valid_lft 239sec preferred_lft 239sec\n inet6 2001:db8::2f:2d03:83f0:a643/64 scope global dynamic noprefixroute \n valid_lft 1797sec preferred_lft 1797sec\n inet6 fe80::866e:64d9:d944:6bc0/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:27.434263", "end": "2024-09-24 14:55:27.437735", "delta": "0:00:00.003472", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
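For orientation, the module result captured above (cmd ["ip", "-6", "a", "s", "nm-bond"], rc=0, stdout listing the bond's IPv6 addresses) is what a test task along the following lines would produce. This is a hedged reconstruction, not the playbook source: the command and the until-expression mirror the log entries further down, while the retries/delay values are placeholders and the log attributes the result variable to set_fact, so the real test may wire it up differently.

# Hedged sketch of the "** TEST check IPv6" task seen in this part of the log.
- name: "** TEST check IPv6"
  command: ip -6 a s nm-bond
  register: result                   # assumed; the log attributes 'result' to set_fact
  until: address in result.stdout    # matches "Evaluated conditional (address in result.stdout)"
  retries: 20                        # placeholder; actual retry count not visible in this log
  delay: 3                           # placeholder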
13830 1727204127.45288: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204127.45290: _low_level_execute_command(): starting 13830 1727204127.45292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204127.1672077-18150-68560265369671/ > /dev/null 2>&1 && sleep 0' 13830 1727204127.46102: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204127.46116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.46129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.46155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.46206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.46218: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204127.46231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.46248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204127.46270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204127.46291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204127.46304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204127.46317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204127.46331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204127.46343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204127.46353: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204127.46367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204127.46455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204127.46485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204127.46508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204127.46769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204127.48473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204127.48492: stderr chunk (state=3): >>><<< 13830 1727204127.48512: stdout chunk (state=3): >>><<< 13830 1727204127.48532: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204127.48535: handler run complete 13830 1727204127.48557: Evaluated conditional (False): False 13830 1727204127.48675: variable 'address' from source: include params 13830 1727204127.48679: variable 'result' from source: set_fact 13830 1727204127.48692: Evaluated conditional (address in result.stdout): True 13830 1727204127.48701: attempt loop complete, returning result 13830 1727204127.48704: _execute() done 13830 1727204127.48707: dumping result to json 13830 1727204127.48715: done dumping result, returning 13830 1727204127.48720: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 [0affcd87-79f5-1659-6b02-000000000dc7] 13830 1727204127.48725: sending task result for task 0affcd87-79f5-1659-6b02-000000000dc7 13830 1727204127.48828: done sending task result for task 0affcd87-79f5-1659-6b02-000000000dc7 13830 1727204127.48832: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003472", "end": "2024-09-24 14:55:27.437735", "rc": 0, "start": "2024-09-24 14:55:27.434263" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::1aa/128 scope global dynamic noprefixroute valid_lft 239sec preferred_lft 239sec inet6 2001:db8::2f:2d03:83f0:a643/64 scope global dynamic noprefixroute valid_lft 1797sec preferred_lft 1797sec inet6 fe80::866e:64d9:d944:6bc0/64 scope link noprefixroute valid_lft forever preferred_lft forever 13830 1727204127.48919: no more pending results, returning what we have 13830 1727204127.48923: results queue empty 13830 1727204127.48924: checking for any_errors_fatal 13830 1727204127.48925: done checking for any_errors_fatal 13830 1727204127.48926: checking for max_fail_percentage 13830 1727204127.48928: done checking for max_fail_percentage 13830 1727204127.48929: checking to see if all hosts have failed and the running result is not ok 13830 1727204127.48929: done checking to see if all hosts have failed 13830 1727204127.48930: getting the remaining hosts for this loop 13830 1727204127.48932: done getting the remaining hosts for this loop 13830 1727204127.48936: getting the next task for host managed-node3 13830 1727204127.48944: done getting next task for host managed-node3 13830 1727204127.48947: ^ task is: TASK: Conditional asserts 13830 1727204127.48951: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204127.48955: getting variables 13830 1727204127.48957: in VariableManager get_vars() 13830 1727204127.49002: Calling all_inventory to load vars for managed-node3 13830 1727204127.49004: Calling groups_inventory to load vars for managed-node3 13830 1727204127.49006: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.49020: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.49022: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.49025: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.49865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.51917: done with get_vars() 13830 1727204127.51950: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.444) 0:01:00.598 ***** 13830 1727204127.52083: entering _queue_task() for managed-node3/include_tasks 13830 1727204127.52451: worker is 1 (out of 1 available) 13830 1727204127.52466: exiting _queue_task() for managed-node3/include_tasks 13830 1727204127.52478: done queuing things up, now waiting for results queue to drain 13830 1727204127.52480: waiting for pending results... 
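The include that starts here, Conditional asserts (run_test.yml:42), is skipped a few entries below with "No items in the list", i.e. its loop expanded to an empty list. A minimal sketch of that shape follows; the loop variable name is an assumption, not taken from this log.

# Hedged sketch only: an include driven by an (empty) list of assert task files.
# 'lsr_assert_when' is an assumed variable name, not confirmed by this log.
- name: Conditional asserts
  include_tasks: "{{ item }}"
  loop: "{{ lsr_assert_when | d([]) }}"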
13830 1727204127.52673: running TaskExecutor() for managed-node3/TASK: Conditional asserts 13830 1727204127.52749: in run() - task 0affcd87-79f5-1659-6b02-0000000008f0 13830 1727204127.52760: variable 'ansible_search_path' from source: unknown 13830 1727204127.52765: variable 'ansible_search_path' from source: unknown 13830 1727204127.52988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204127.54591: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204127.54645: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204127.54677: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204127.54704: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204127.54727: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204127.54795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204127.54813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204127.54831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204127.54860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204127.54872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204127.54987: dumping result to json 13830 1727204127.54990: done dumping result, returning 13830 1727204127.55000: done running TaskExecutor() for managed-node3/TASK: Conditional asserts [0affcd87-79f5-1659-6b02-0000000008f0] 13830 1727204127.55003: sending task result for task 0affcd87-79f5-1659-6b02-0000000008f0 13830 1727204127.55094: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008f0 13830 1727204127.55096: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "skipped_reason": "No items in the list" } 13830 1727204127.55157: no more pending results, returning what we have 13830 1727204127.55160: results queue empty 13830 1727204127.55161: checking for any_errors_fatal 13830 1727204127.55174: done checking for any_errors_fatal 13830 1727204127.55175: checking for max_fail_percentage 13830 1727204127.55177: done checking for max_fail_percentage 13830 1727204127.55178: checking to see if all hosts have failed and the running result is not ok 13830 1727204127.55179: done checking to see if all hosts have failed 13830 1727204127.55180: getting the remaining hosts for this loop 13830 1727204127.55181: done getting the remaining hosts for this loop 13830 1727204127.55186: getting the next task for host managed-node3 13830 1727204127.55193: done 
getting next task for host managed-node3 13830 1727204127.55196: ^ task is: TASK: Success in test '{{ lsr_description }}' 13830 1727204127.55199: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204127.55204: getting variables 13830 1727204127.55205: in VariableManager get_vars() 13830 1727204127.55258: Calling all_inventory to load vars for managed-node3 13830 1727204127.55261: Calling groups_inventory to load vars for managed-node3 13830 1727204127.55263: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.55274: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.55276: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.55279: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.56101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.57036: done with get_vars() 13830 1727204127.57054: done getting variables 13830 1727204127.57101: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204127.57193: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.051) 0:01:00.650 ***** 13830 1727204127.57218: entering _queue_task() for managed-node3/debug 13830 1727204127.57466: worker is 1 (out of 1 available) 13830 1727204127.57480: exiting _queue_task() for managed-node3/debug 13830 1727204127.57492: done queuing things up, now waiting for results queue to drain 13830 1727204127.57493: waiting for pending results... 13830 1727204127.57687: running TaskExecutor() for managed-node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
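The next entries run the success banner for this test. The task is a plain debug whose message format can be read back from the MSG block printed below; a hedged sketch:

# Hedged sketch of the success banner task; the name and the
# "+++++ Success in test '...' +++++" message format match the log output below.
- name: "Success in test '{{ lsr_description }}'"
  debug:
    msg: "+++++ Success in test '{{ lsr_description }}' +++++"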
13830 1727204127.57761: in run() - task 0affcd87-79f5-1659-6b02-0000000008f1 13830 1727204127.57774: variable 'ansible_search_path' from source: unknown 13830 1727204127.57778: variable 'ansible_search_path' from source: unknown 13830 1727204127.57806: calling self._execute() 13830 1727204127.57889: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.57893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.57901: variable 'omit' from source: magic vars 13830 1727204127.58172: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.58183: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.58189: variable 'omit' from source: magic vars 13830 1727204127.58216: variable 'omit' from source: magic vars 13830 1727204127.58288: variable 'lsr_description' from source: include params 13830 1727204127.58301: variable 'omit' from source: magic vars 13830 1727204127.58336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204127.58369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204127.58385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204127.58398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204127.58407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204127.58430: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204127.58433: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.58439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.58511: Set connection var ansible_connection to ssh 13830 1727204127.58520: Set connection var ansible_timeout to 10 13830 1727204127.58525: Set connection var ansible_shell_executable to /bin/sh 13830 1727204127.58527: Set connection var ansible_shell_type to sh 13830 1727204127.58532: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204127.58542: Set connection var ansible_pipelining to False 13830 1727204127.58559: variable 'ansible_shell_executable' from source: unknown 13830 1727204127.58562: variable 'ansible_connection' from source: unknown 13830 1727204127.58566: variable 'ansible_module_compression' from source: unknown 13830 1727204127.58568: variable 'ansible_shell_type' from source: unknown 13830 1727204127.58570: variable 'ansible_shell_executable' from source: unknown 13830 1727204127.58573: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.58577: variable 'ansible_pipelining' from source: unknown 13830 1727204127.58579: variable 'ansible_timeout' from source: unknown 13830 1727204127.58584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.58685: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204127.58696: variable 'omit' from source: magic vars 13830 1727204127.58699: 
starting attempt loop 13830 1727204127.58701: running the handler 13830 1727204127.58743: handler run complete 13830 1727204127.58752: attempt loop complete, returning result 13830 1727204127.58755: _execute() done 13830 1727204127.58758: dumping result to json 13830 1727204127.58760: done dumping result, returning 13830 1727204127.58767: done running TaskExecutor() for managed-node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [0affcd87-79f5-1659-6b02-0000000008f1] 13830 1727204127.58773: sending task result for task 0affcd87-79f5-1659-6b02-0000000008f1 13830 1727204127.58854: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008f1 13830 1727204127.58857: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++ 13830 1727204127.58910: no more pending results, returning what we have 13830 1727204127.58914: results queue empty 13830 1727204127.58914: checking for any_errors_fatal 13830 1727204127.58920: done checking for any_errors_fatal 13830 1727204127.58921: checking for max_fail_percentage 13830 1727204127.58923: done checking for max_fail_percentage 13830 1727204127.58924: checking to see if all hosts have failed and the running result is not ok 13830 1727204127.58924: done checking to see if all hosts have failed 13830 1727204127.58925: getting the remaining hosts for this loop 13830 1727204127.58927: done getting the remaining hosts for this loop 13830 1727204127.58931: getting the next task for host managed-node3 13830 1727204127.58938: done getting next task for host managed-node3 13830 1727204127.58941: ^ task is: TASK: Cleanup 13830 1727204127.58945: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204127.58952: getting variables 13830 1727204127.58953: in VariableManager get_vars() 13830 1727204127.59003: Calling all_inventory to load vars for managed-node3 13830 1727204127.59006: Calling groups_inventory to load vars for managed-node3 13830 1727204127.59008: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.59017: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.59019: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.59022: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.59967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.60882: done with get_vars() 13830 1727204127.60897: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.037) 0:01:00.687 ***** 13830 1727204127.60970: entering _queue_task() for managed-node3/include_tasks 13830 1727204127.61204: worker is 1 (out of 1 available) 13830 1727204127.61220: exiting _queue_task() for managed-node3/include_tasks 13830 1727204127.61232: done queuing things up, now waiting for results queue to drain 13830 1727204127.61233: waiting for pending results... 13830 1727204127.61424: running TaskExecutor() for managed-node3/TASK: Cleanup 13830 1727204127.61510: in run() - task 0affcd87-79f5-1659-6b02-0000000008f5 13830 1727204127.61521: variable 'ansible_search_path' from source: unknown 13830 1727204127.61525: variable 'ansible_search_path' from source: unknown 13830 1727204127.61564: variable 'lsr_cleanup' from source: include params 13830 1727204127.61718: variable 'lsr_cleanup' from source: include params 13830 1727204127.61776: variable 'omit' from source: magic vars 13830 1727204127.61887: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.61894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.61904: variable 'omit' from source: magic vars 13830 1727204127.62081: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.62089: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.62094: variable 'item' from source: unknown 13830 1727204127.62146: variable 'item' from source: unknown 13830 1727204127.62170: variable 'item' from source: unknown 13830 1727204127.62212: variable 'item' from source: unknown 13830 1727204127.62345: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.62349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.62352: variable 'omit' from source: magic vars 13830 1727204127.62431: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.62438: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.62444: variable 'item' from source: unknown 13830 1727204127.62490: variable 'item' from source: unknown 13830 1727204127.62514: variable 'item' from source: unknown 13830 1727204127.62558: variable 'item' from source: unknown 13830 1727204127.62631: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.62635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 
1727204127.62639: variable 'omit' from source: magic vars 13830 1727204127.62744: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.62748: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.62751: variable 'item' from source: unknown 13830 1727204127.62800: variable 'item' from source: unknown 13830 1727204127.62821: variable 'item' from source: unknown 13830 1727204127.62864: variable 'item' from source: unknown 13830 1727204127.62938: dumping result to json 13830 1727204127.62940: done dumping result, returning 13830 1727204127.62942: done running TaskExecutor() for managed-node3/TASK: Cleanup [0affcd87-79f5-1659-6b02-0000000008f5] 13830 1727204127.62944: sending task result for task 0affcd87-79f5-1659-6b02-0000000008f5 13830 1727204127.62982: done sending task result for task 0affcd87-79f5-1659-6b02-0000000008f5 13830 1727204127.62984: WORKER PROCESS EXITING 13830 1727204127.63015: no more pending results, returning what we have 13830 1727204127.63024: in VariableManager get_vars() 13830 1727204127.63075: Calling all_inventory to load vars for managed-node3 13830 1727204127.63078: Calling groups_inventory to load vars for managed-node3 13830 1727204127.63080: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.63093: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.63095: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.63098: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.63971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.64894: done with get_vars() 13830 1727204127.64908: variable 'ansible_search_path' from source: unknown 13830 1727204127.64909: variable 'ansible_search_path' from source: unknown 13830 1727204127.64939: variable 'ansible_search_path' from source: unknown 13830 1727204127.64939: variable 'ansible_search_path' from source: unknown 13830 1727204127.64960: variable 'ansible_search_path' from source: unknown 13830 1727204127.64961: variable 'ansible_search_path' from source: unknown 13830 1727204127.64981: we have included files to process 13830 1727204127.64982: generating all_blocks data 13830 1727204127.64983: done generating all_blocks data 13830 1727204127.64986: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 13830 1727204127.64987: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 13830 1727204127.64988: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 13830 1727204127.65114: in VariableManager get_vars() 13830 1727204127.65131: done with get_vars() 13830 1727204127.65135: variable 'omit' from source: magic vars 13830 1727204127.65160: variable 'omit' from source: magic vars 13830 1727204127.65197: in VariableManager get_vars() 13830 1727204127.65209: done with get_vars() 13830 1727204127.65227: in VariableManager get_vars() 13830 1727204127.65241: done with get_vars() 13830 1727204127.65267: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 13830 1727204127.65338: Loading data from 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 13830 1727204127.65428: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 13830 1727204127.65694: in VariableManager get_vars() 13830 1727204127.65709: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13830 1727204127.67657: done processing included file 13830 1727204127.67666: iterating over new_blocks loaded from include file 13830 1727204127.67668: in VariableManager get_vars() 13830 1727204127.67685: done with get_vars() 13830 1727204127.67686: filtering new block on tags 13830 1727204127.72496: done filtering new block on tags 13830 1727204127.72501: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed-node3 => (item=tasks/cleanup_bond_profile+device.yml) 13830 1727204127.72505: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 13830 1727204127.72507: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 13830 1727204127.72511: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 13830 1727204127.72900: done processing included file 13830 1727204127.72902: iterating over new_blocks loaded from include file 13830 1727204127.72904: in VariableManager get_vars() 13830 1727204127.72926: done with get_vars() 13830 1727204127.72928: filtering new block on tags 13830 1727204127.72957: done filtering new block on tags 13830 1727204127.72960: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed-node3 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 13830 1727204127.72963: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13830 1727204127.72971: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13830 1727204127.72974: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13830 1727204127.73309: done processing included file 13830 1727204127.73311: iterating over new_blocks loaded from include file 13830 1727204127.73313: in VariableManager get_vars() 13830 1727204127.73333: done with get_vars() 13830 1727204127.73334: filtering new block on tags 13830 1727204127.73365: done filtering new block on tags 13830 1727204127.73368: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 => (item=tasks/check_network_dns.yml) 13830 1727204127.73371: extending task lists for all hosts with included blocks 13830 1727204127.76944: done extending task lists 13830 1727204127.76946: done processing included files 13830 1727204127.76947: results queue empty 13830 1727204127.76947: 
checking for any_errors_fatal 13830 1727204127.76951: done checking for any_errors_fatal 13830 1727204127.76952: checking for max_fail_percentage 13830 1727204127.76953: done checking for max_fail_percentage 13830 1727204127.76954: checking to see if all hosts have failed and the running result is not ok 13830 1727204127.76955: done checking to see if all hosts have failed 13830 1727204127.76956: getting the remaining hosts for this loop 13830 1727204127.76957: done getting the remaining hosts for this loop 13830 1727204127.76959: getting the next task for host managed-node3 13830 1727204127.76965: done getting next task for host managed-node3 13830 1727204127.76968: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13830 1727204127.76972: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204127.76984: getting variables 13830 1727204127.76985: in VariableManager get_vars() 13830 1727204127.77006: Calling all_inventory to load vars for managed-node3 13830 1727204127.77009: Calling groups_inventory to load vars for managed-node3 13830 1727204127.77011: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.77017: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.77019: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.77021: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.78009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.78954: done with get_vars() 13830 1727204127.78974: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.180) 0:01:00.868 ***** 13830 1727204127.79034: entering _queue_task() for managed-node3/include_tasks 13830 1727204127.79349: worker is 1 (out of 1 available) 13830 1727204127.79362: exiting _queue_task() for managed-node3/include_tasks 13830 1727204127.79376: done queuing things up, now waiting for results queue to drain 13830 1727204127.79377: waiting for pending results... 
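Before following the role's fact gathering below, the Cleanup include processed in the entries above can be sketched. It loops over lsr_cleanup (an include parameter) and pulled three task files into the play for managed-node3; the loop/when layout is inferred from the per-item "Evaluated conditional" entries and may in fact live on an enclosing block.

# Hedged reconstruction of the Cleanup include. The three items match the files
# the log reports as included; everything else is inferred.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
  when: ansible_distribution_major_version != '6'
# lsr_cleanup expanded here to:
#   - tasks/cleanup_bond_profile+device.yml
#   - tasks/remove_test_interfaces_with_dhcp.yml
#   - tasks/check_network_dns.yml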
13830 1727204127.79699: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13830 1727204127.79835: in run() - task 0affcd87-79f5-1659-6b02-000000000e0a 13830 1727204127.79854: variable 'ansible_search_path' from source: unknown 13830 1727204127.79858: variable 'ansible_search_path' from source: unknown 13830 1727204127.79899: calling self._execute() 13830 1727204127.80007: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.80011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.80024: variable 'omit' from source: magic vars 13830 1727204127.80452: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.80456: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.80467: _execute() done 13830 1727204127.80470: dumping result to json 13830 1727204127.80473: done dumping result, returning 13830 1727204127.80480: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-1659-6b02-000000000e0a] 13830 1727204127.80498: sending task result for task 0affcd87-79f5-1659-6b02-000000000e0a 13830 1727204127.80591: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e0a 13830 1727204127.80595: WORKER PROCESS EXITING 13830 1727204127.80651: no more pending results, returning what we have 13830 1727204127.80657: in VariableManager get_vars() 13830 1727204127.80718: Calling all_inventory to load vars for managed-node3 13830 1727204127.80721: Calling groups_inventory to load vars for managed-node3 13830 1727204127.80724: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.80741: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.80745: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.80749: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.81778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.82745: done with get_vars() 13830 1727204127.82760: variable 'ansible_search_path' from source: unknown 13830 1727204127.82761: variable 'ansible_search_path' from source: unknown 13830 1727204127.82797: we have included files to process 13830 1727204127.82801: generating all_blocks data 13830 1727204127.82803: done generating all_blocks data 13830 1727204127.82808: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204127.82809: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204127.82815: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13830 1727204127.83416: done processing included file 13830 1727204127.83418: iterating over new_blocks loaded from include file 13830 1727204127.83419: in VariableManager get_vars() 13830 1727204127.83455: done with get_vars() 13830 1727204127.83457: filtering new block on tags 13830 1727204127.83491: done filtering new block on tags 13830 1727204127.83495: in VariableManager get_vars() 13830 1727204127.83527: done with get_vars() 13830 1727204127.83531: filtering new block on tags 13830 1727204127.83584: done filtering new block on tags 13830 1727204127.83587: in 
VariableManager get_vars() 13830 1727204127.83617: done with get_vars() 13830 1727204127.83619: filtering new block on tags 13830 1727204127.83667: done filtering new block on tags 13830 1727204127.83670: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 13830 1727204127.83675: extending task lists for all hosts with included blocks 13830 1727204127.85262: done extending task lists 13830 1727204127.85265: done processing included files 13830 1727204127.85265: results queue empty 13830 1727204127.85266: checking for any_errors_fatal 13830 1727204127.85269: done checking for any_errors_fatal 13830 1727204127.85269: checking for max_fail_percentage 13830 1727204127.85270: done checking for max_fail_percentage 13830 1727204127.85271: checking to see if all hosts have failed and the running result is not ok 13830 1727204127.85271: done checking to see if all hosts have failed 13830 1727204127.85272: getting the remaining hosts for this loop 13830 1727204127.85273: done getting the remaining hosts for this loop 13830 1727204127.85275: getting the next task for host managed-node3 13830 1727204127.85279: done getting next task for host managed-node3 13830 1727204127.85280: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13830 1727204127.85283: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204127.85291: getting variables 13830 1727204127.85292: in VariableManager get_vars() 13830 1727204127.85305: Calling all_inventory to load vars for managed-node3 13830 1727204127.85306: Calling groups_inventory to load vars for managed-node3 13830 1727204127.85308: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.85311: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.85313: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.85314: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.85995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.87285: done with get_vars() 13830 1727204127.87307: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.083) 0:01:00.952 ***** 13830 1727204127.87406: entering _queue_task() for managed-node3/setup 13830 1727204127.87765: worker is 1 (out of 1 available) 13830 1727204127.87779: exiting _queue_task() for managed-node3/setup 13830 1727204127.87791: done queuing things up, now waiting for results queue to drain 13830 1727204127.87793: waiting for pending results... 13830 1727204127.88078: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13830 1727204127.88233: in run() - task 0affcd87-79f5-1659-6b02-000000000fde 13830 1727204127.88237: variable 'ansible_search_path' from source: unknown 13830 1727204127.88240: variable 'ansible_search_path' from source: unknown 13830 1727204127.88259: calling self._execute() 13830 1727204127.88362: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.88381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.88385: variable 'omit' from source: magic vars 13830 1727204127.88787: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.88800: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.89045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204127.90799: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204127.90856: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204127.90886: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204127.90916: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204127.90942: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204127.91001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204127.91022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13830 1727204127.91047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204127.91075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204127.91086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204127.91123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204127.91144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204127.91184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204127.91224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204127.91238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204127.91449: variable '__network_required_facts' from source: role '' defaults 13830 1727204127.91452: variable 'ansible_facts' from source: unknown 13830 1727204127.92149: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13830 1727204127.92153: when evaluation is False, skipping this task 13830 1727204127.92155: _execute() done 13830 1727204127.92158: dumping result to json 13830 1727204127.92160: done dumping result, returning 13830 1727204127.92169: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-1659-6b02-000000000fde] 13830 1727204127.92188: sending task result for task 0affcd87-79f5-1659-6b02-000000000fde 13830 1727204127.92271: done sending task result for task 0affcd87-79f5-1659-6b02-000000000fde 13830 1727204127.92274: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204127.92316: no more pending results, returning what we have 13830 1727204127.92321: results queue empty 13830 1727204127.92321: checking for any_errors_fatal 13830 1727204127.92323: done checking for any_errors_fatal 13830 1727204127.92324: checking for max_fail_percentage 13830 1727204127.92326: done checking for max_fail_percentage 13830 1727204127.92327: checking to see if all hosts have failed and the running result is not ok 13830 1727204127.92327: done checking to see if all hosts have failed 13830 1727204127.92328: getting the remaining hosts for this loop 13830 1727204127.92330: done getting the remaining hosts for 
this loop 13830 1727204127.92334: getting the next task for host managed-node3 13830 1727204127.92344: done getting next task for host managed-node3 13830 1727204127.92348: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13830 1727204127.92354: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204127.92379: getting variables 13830 1727204127.92381: in VariableManager get_vars() 13830 1727204127.92429: Calling all_inventory to load vars for managed-node3 13830 1727204127.92432: Calling groups_inventory to load vars for managed-node3 13830 1727204127.92434: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.92443: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.92445: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.92454: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.93461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.94426: done with get_vars() 13830 1727204127.94450: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.071) 0:01:01.023 ***** 13830 1727204127.94525: entering _queue_task() for managed-node3/stat 13830 1727204127.94782: worker is 1 (out of 1 available) 13830 1727204127.94797: exiting _queue_task() for managed-node3/stat 13830 1727204127.94809: done queuing things up, now waiting for results queue to drain 13830 1727204127.94812: waiting for pending results... 
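The set_facts.yml entries above show the role's conditional fact gathering being skipped because every fact named in __network_required_facts is already present. A hedged sketch of that task: the when-expression is copied from the "Evaluated conditional" entry, while the gather_subset value is an assumption.

# Hedged sketch of "Ensure ansible_facts used by role are present" (skipped above).
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min               # assumed; the actual subset is not visible in this log
  no_log: true                       # the skip message above notes no_log was set for this result
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0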
13830 1727204127.95008: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13830 1727204127.95113: in run() - task 0affcd87-79f5-1659-6b02-000000000fe0 13830 1727204127.95124: variable 'ansible_search_path' from source: unknown 13830 1727204127.95127: variable 'ansible_search_path' from source: unknown 13830 1727204127.95159: calling self._execute() 13830 1727204127.95240: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.95244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.95254: variable 'omit' from source: magic vars 13830 1727204127.95540: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.95551: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.95676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204127.95879: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204127.95913: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204127.95939: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204127.95974: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204127.96035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204127.96056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204127.96076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204127.96097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204127.96167: variable '__network_is_ostree' from source: set_fact 13830 1727204127.96173: Evaluated conditional (not __network_is_ostree is defined): False 13830 1727204127.96176: when evaluation is False, skipping this task 13830 1727204127.96178: _execute() done 13830 1727204127.96181: dumping result to json 13830 1727204127.96183: done dumping result, returning 13830 1727204127.96196: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-1659-6b02-000000000fe0] 13830 1727204127.96199: sending task result for task 0affcd87-79f5-1659-6b02-000000000fe0 13830 1727204127.96283: done sending task result for task 0affcd87-79f5-1659-6b02-000000000fe0 13830 1727204127.96286: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13830 1727204127.96341: no more pending results, returning what we have 13830 1727204127.96345: results queue empty 13830 1727204127.96346: checking for any_errors_fatal 13830 1727204127.96354: done checking for any_errors_fatal 13830 1727204127.96354: checking for 
max_fail_percentage 13830 1727204127.96356: done checking for max_fail_percentage 13830 1727204127.96357: checking to see if all hosts have failed and the running result is not ok 13830 1727204127.96358: done checking to see if all hosts have failed 13830 1727204127.96358: getting the remaining hosts for this loop 13830 1727204127.96360: done getting the remaining hosts for this loop 13830 1727204127.96367: getting the next task for host managed-node3 13830 1727204127.96376: done getting next task for host managed-node3 13830 1727204127.96380: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13830 1727204127.96386: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204127.96417: getting variables 13830 1727204127.96419: in VariableManager get_vars() 13830 1727204127.96460: Calling all_inventory to load vars for managed-node3 13830 1727204127.96463: Calling groups_inventory to load vars for managed-node3 13830 1727204127.96467: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204127.96476: Calling all_plugins_play to load vars for managed-node3 13830 1727204127.96478: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204127.96481: Calling groups_plugins_play to load vars for managed-node3 13830 1727204127.97476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204127.98402: done with get_vars() 13830 1727204127.98419: done getting variables 13830 1727204127.98466: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:27 -0400 (0:00:00.039) 0:01:01.063 ***** 13830 1727204127.98496: entering _queue_task() for managed-node3/set_fact 13830 1727204127.98732: worker is 1 (out of 1 available) 13830 1727204127.98748: exiting _queue_task() for managed-node3/set_fact 13830 1727204127.98760: done queuing things up, now waiting for results queue to drain 13830 1727204127.98761: waiting for pending results... 13830 1727204127.98959: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13830 1727204127.99073: in run() - task 0affcd87-79f5-1659-6b02-000000000fe1 13830 1727204127.99084: variable 'ansible_search_path' from source: unknown 13830 1727204127.99088: variable 'ansible_search_path' from source: unknown 13830 1727204127.99118: calling self._execute() 13830 1727204127.99193: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204127.99196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204127.99207: variable 'omit' from source: magic vars 13830 1727204127.99486: variable 'ansible_distribution_major_version' from source: facts 13830 1727204127.99496: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204127.99651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204128.00079: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204128.00123: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204128.00160: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204128.00197: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204128.00284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204128.00310: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204128.00339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204128.00364: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204128.00450: variable '__network_is_ostree' from source: set_fact 13830 1727204128.00457: Evaluated conditional (not __network_is_ostree is defined): False 13830 1727204128.00460: when evaluation is False, skipping this task 13830 1727204128.00463: _execute() done 13830 1727204128.00466: dumping result to json 13830 1727204128.00471: done dumping result, returning 13830 1727204128.00480: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-1659-6b02-000000000fe1] 13830 1727204128.00485: sending task result for task 0affcd87-79f5-1659-6b02-000000000fe1 13830 1727204128.00596: done sending task result for task 0affcd87-79f5-1659-6b02-000000000fe1 13830 1727204128.00600: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13830 1727204128.00653: no more pending results, returning what we have 13830 1727204128.00657: results queue empty 13830 1727204128.00658: checking for any_errors_fatal 13830 1727204128.00668: done checking for any_errors_fatal 13830 1727204128.00669: checking for max_fail_percentage 13830 1727204128.00671: done checking for max_fail_percentage 13830 1727204128.00672: checking to see if all hosts have failed and the running result is not ok 13830 1727204128.00673: done checking to see if all hosts have failed 13830 1727204128.00673: getting the remaining hosts for this loop 13830 1727204128.00675: done getting the remaining hosts for this loop 13830 1727204128.00679: getting the next task for host managed-node3 13830 1727204128.00690: done getting next task for host managed-node3 13830 1727204128.00693: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13830 1727204128.00699: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204128.00719: getting variables 13830 1727204128.00721: in VariableManager get_vars() 13830 1727204128.00765: Calling all_inventory to load vars for managed-node3 13830 1727204128.00768: Calling groups_inventory to load vars for managed-node3 13830 1727204128.00770: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204128.00779: Calling all_plugins_play to load vars for managed-node3 13830 1727204128.00781: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204128.00783: Calling groups_plugins_play to load vars for managed-node3 13830 1727204128.02329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204128.03565: done with get_vars() 13830 1727204128.03583: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:28 -0400 (0:00:00.051) 0:01:01.114 ***** 13830 1727204128.03659: entering _queue_task() for managed-node3/service_facts 13830 1727204128.03895: worker is 1 (out of 1 available) 13830 1727204128.03909: exiting _queue_task() for managed-node3/service_facts 13830 1727204128.03921: done queuing things up, now waiting for results queue to drain 13830 1727204128.03922: waiting for pending results... 
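[editor's note] The "Check which services are running" task runs the service_facts module on the remote host and the resulting ansible_facts.services mapping is dumped in full further down. A minimal sketch of how such a mapping is typically consumed follows; the provider_from_services helper and the "nm"/"initscripts" provider names are assumptions for illustration, while the dictionary shape (unit name keyed to name/state/status/source) matches the module output captured below.

from typing import Dict

def provider_from_services(services: Dict[str, dict]) -> str:
    """Pick a network provider based on whether NetworkManager is running."""
    nm = services.get("NetworkManager.service", {})
    if nm.get("state") == "running":
        return "nm"
    return "initscripts"   # fallback name chosen only for this sketch

# Trimmed sample shaped like the service_facts output recorded later in the log.
sample = {
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
}
print(provider_from_services(sample))   # -> "nm"

The log entries that follow show the usual remote-execution sequence for such a module: discover the remote home directory, create a per-task temporary directory, transfer the AnsiballZ payload over SFTP, chmod it, and execute it with the remote Python interpreter.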
13830 1727204128.04109: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 13830 1727204128.04216: in run() - task 0affcd87-79f5-1659-6b02-000000000fe3 13830 1727204128.04227: variable 'ansible_search_path' from source: unknown 13830 1727204128.04231: variable 'ansible_search_path' from source: unknown 13830 1727204128.04262: calling self._execute() 13830 1727204128.04339: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204128.04342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204128.04350: variable 'omit' from source: magic vars 13830 1727204128.04629: variable 'ansible_distribution_major_version' from source: facts 13830 1727204128.04639: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204128.04646: variable 'omit' from source: magic vars 13830 1727204128.04706: variable 'omit' from source: magic vars 13830 1727204128.04731: variable 'omit' from source: magic vars 13830 1727204128.04768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204128.04796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204128.04814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204128.04827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204128.04838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204128.04860: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204128.04863: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204128.04867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204128.04939: Set connection var ansible_connection to ssh 13830 1727204128.04946: Set connection var ansible_timeout to 10 13830 1727204128.04952: Set connection var ansible_shell_executable to /bin/sh 13830 1727204128.04954: Set connection var ansible_shell_type to sh 13830 1727204128.04960: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204128.04969: Set connection var ansible_pipelining to False 13830 1727204128.04985: variable 'ansible_shell_executable' from source: unknown 13830 1727204128.04987: variable 'ansible_connection' from source: unknown 13830 1727204128.04991: variable 'ansible_module_compression' from source: unknown 13830 1727204128.04993: variable 'ansible_shell_type' from source: unknown 13830 1727204128.04996: variable 'ansible_shell_executable' from source: unknown 13830 1727204128.04998: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204128.05002: variable 'ansible_pipelining' from source: unknown 13830 1727204128.05004: variable 'ansible_timeout' from source: unknown 13830 1727204128.05007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204128.05149: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204128.05158: variable 'omit' from source: magic vars 13830 
1727204128.05163: starting attempt loop 13830 1727204128.05168: running the handler 13830 1727204128.05178: _low_level_execute_command(): starting 13830 1727204128.05184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204128.05706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.05722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.05740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204128.05755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204128.05810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204128.05814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204128.05879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204128.07489: stdout chunk (state=3): >>>/root <<< 13830 1727204128.07592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204128.07649: stderr chunk (state=3): >>><<< 13830 1727204128.07656: stdout chunk (state=3): >>><<< 13830 1727204128.07674: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204128.07687: _low_level_execute_command(): starting 13830 1727204128.07692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492 `" && echo ansible-tmp-1727204128.076764-18205-129078567156492="` 
echo /root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492 `" ) && sleep 0' 13830 1727204128.08398: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204128.08406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204128.08417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.08431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.08474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204128.08482: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204128.08494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204128.08505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204128.08514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204128.08520: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204128.08527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204128.08539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.08549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.08557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204128.08571: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204128.08580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204128.08650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204128.08669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204128.08677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204128.08750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204128.10556: stdout chunk (state=3): >>>ansible-tmp-1727204128.076764-18205-129078567156492=/root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492 <<< 13830 1727204128.10669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204128.10725: stderr chunk (state=3): >>><<< 13830 1727204128.10728: stdout chunk (state=3): >>><<< 13830 1727204128.10745: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204128.076764-18205-129078567156492=/root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204128.10815: variable 'ansible_module_compression' from source: unknown 13830 1727204128.10862: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13830 1727204128.10983: variable 'ansible_facts' from source: unknown 13830 1727204128.11022: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492/AnsiballZ_service_facts.py 13830 1727204128.11337: Sending initial data 13830 1727204128.11340: Sent initial data (161 bytes) 13830 1727204128.12420: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204128.12440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204128.12454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.12474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.12525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204128.12544: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204128.12558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204128.12578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204128.12590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204128.12604: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204128.12618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204128.12630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.12652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.12666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204128.12677: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204128.12689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204128.12780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204128.12799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204128.12818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204128.12907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204128.14597: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204128.14637: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204128.14698: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpp_0w9qv6 /root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492/AnsiballZ_service_facts.py <<< 13830 1727204128.14718: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204128.15982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204128.16254: stderr chunk (state=3): >>><<< 13830 1727204128.16258: stdout chunk (state=3): >>><<< 13830 1727204128.16260: done transferring module to remote 13830 1727204128.16263: _low_level_execute_command(): starting 13830 1727204128.16271: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492/ /root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492/AnsiballZ_service_facts.py && sleep 0' 13830 1727204128.16876: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204128.16879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204128.16890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.16910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.16948: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204128.16955: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204128.16967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204128.16981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204128.16988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204128.16995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204128.17003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204128.17017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.17028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.17038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204128.17041: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204128.17051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204128.17145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204128.17153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204128.17157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204128.17229: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204128.18911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204128.18996: stderr chunk (state=3): >>><<< 13830 1727204128.19000: stdout chunk (state=3): >>><<< 13830 1727204128.19016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204128.19019: _low_level_execute_command(): starting 13830 1727204128.19024: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492/AnsiballZ_service_facts.py && sleep 0' 13830 1727204128.19672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204128.19684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204128.19694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.19709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.19753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204128.19760: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204128.19783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204128.19794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204128.19801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204128.19808: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204128.19816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204128.19826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204128.19838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204128.19846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204128.19852: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204128.19862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 
1727204128.19939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204128.19953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204128.19965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204128.20040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204129.48638: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped"<<< 13830 1727204129.48693: stdout chunk (state=3): >>>, "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": 
"systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13830 1727204129.49905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204129.49982: stderr chunk (state=3): >>><<< 13830 1727204129.49986: stdout chunk (state=3): >>><<< 13830 1727204129.50016: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": 
{"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": 
"systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204129.50982: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204129.50989: _low_level_execute_command(): starting 13830 1727204129.50994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204128.076764-18205-129078567156492/ > /dev/null 2>&1 && sleep 0' 13830 1727204129.51661: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204129.51679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204129.51688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.51722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.51763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204129.51772: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204129.51782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.51797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204129.51803: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204129.51810: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204129.51817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204129.51826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.51838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.51844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204129.51851: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204129.51861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.51945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204129.51952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204129.51955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204129.52037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204129.54038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204129.54139: stderr chunk (state=3): >>><<< 13830 1727204129.54150: stdout chunk (state=3): >>><<< 13830 1727204129.54382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204129.54386: handler run complete 13830 1727204129.54388: variable 'ansible_facts' from source: unknown 13830 1727204129.54555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204129.55058: variable 'ansible_facts' from source: unknown 13830 1727204129.55192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204129.55394: attempt loop complete, returning result 13830 1727204129.55404: _execute() done 13830 1727204129.55410: dumping result to json 13830 1727204129.55475: done dumping result, returning 13830 1727204129.55488: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-1659-6b02-000000000fe3] 13830 1727204129.55498: sending task result for task 0affcd87-79f5-1659-6b02-000000000fe3 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204129.56593: no more pending results, returning what we have 13830 1727204129.56596: results queue empty 13830 1727204129.56597: checking for any_errors_fatal 13830 1727204129.56600: done checking for any_errors_fatal 13830 1727204129.56601: checking for max_fail_percentage 13830 1727204129.56602: done checking for max_fail_percentage 13830 1727204129.56603: checking to see if all hosts have failed and the running result is not ok 13830 1727204129.56604: done checking to see if all hosts have failed 13830 1727204129.56605: getting the remaining hosts for this loop 13830 1727204129.56606: done getting the remaining hosts for this loop 13830 1727204129.56611: getting the next task for host managed-node3 13830 1727204129.56618: done getting next task for host managed-node3 13830 1727204129.56621: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13830 1727204129.56628: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204129.56641: getting variables 13830 1727204129.56643: in VariableManager get_vars() 13830 1727204129.56684: Calling all_inventory to load vars for managed-node3 13830 1727204129.56687: Calling groups_inventory to load vars for managed-node3 13830 1727204129.56689: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204129.56699: Calling all_plugins_play to load vars for managed-node3 13830 1727204129.56701: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204129.56703: Calling groups_plugins_play to load vars for managed-node3 13830 1727204129.57390: done sending task result for task 0affcd87-79f5-1659-6b02-000000000fe3 13830 1727204129.57393: WORKER PROCESS EXITING 13830 1727204129.58095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204129.59153: done with get_vars() 13830 1727204129.59175: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:29 -0400 (0:00:01.555) 0:01:02.670 ***** 13830 1727204129.59250: entering _queue_task() for managed-node3/package_facts 13830 1727204129.59500: worker is 1 (out of 1 available) 13830 1727204129.59512: exiting _queue_task() for managed-node3/package_facts 13830 1727204129.59525: done queuing things up, now waiting for results queue to drain 13830 1727204129.59526: waiting for pending results... 
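[Editor's note] The ok: [managed-node3] result above is printed as "censored" because that task ran with no_log: true; the gathered facts are still stored for the host, only the displayed result is hidden. The task queued next, "Check which packages are installed", uses the package_facts module, whose ansible_facts.packages structure is dumped later in this log (per package, a list of entries with name, version, release, epoch, arch, source). A hypothetical sketch of both behaviours, not taken from the role itself (the glibc key does appear in the package dump below):

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
      no_log: true   # hypothetical here: hides the large result in output, as with the censored result above

    - name: Show glibc version if present
      ansible.builtin.debug:
        msg: "glibc {{ ansible_facts.packages['glibc'][0].version }}-{{ ansible_facts.packages['glibc'][0].release }}"
      when: "'glibc' in ansible_facts.packages"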
13830 1727204129.59754: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13830 1727204129.59973: in run() - task 0affcd87-79f5-1659-6b02-000000000fe4 13830 1727204129.60019: variable 'ansible_search_path' from source: unknown 13830 1727204129.60031: variable 'ansible_search_path' from source: unknown 13830 1727204129.60088: calling self._execute() 13830 1727204129.60198: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204129.60209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204129.60222: variable 'omit' from source: magic vars 13830 1727204129.60636: variable 'ansible_distribution_major_version' from source: facts 13830 1727204129.60654: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204129.60665: variable 'omit' from source: magic vars 13830 1727204129.60769: variable 'omit' from source: magic vars 13830 1727204129.60851: variable 'omit' from source: magic vars 13830 1727204129.60901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204129.60948: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204129.60977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204129.60998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204129.61012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204129.61053: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204129.61061: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204129.61074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204129.61188: Set connection var ansible_connection to ssh 13830 1727204129.61205: Set connection var ansible_timeout to 10 13830 1727204129.61222: Set connection var ansible_shell_executable to /bin/sh 13830 1727204129.61228: Set connection var ansible_shell_type to sh 13830 1727204129.61239: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204129.61255: Set connection var ansible_pipelining to False 13830 1727204129.61287: variable 'ansible_shell_executable' from source: unknown 13830 1727204129.61295: variable 'ansible_connection' from source: unknown 13830 1727204129.61302: variable 'ansible_module_compression' from source: unknown 13830 1727204129.61308: variable 'ansible_shell_type' from source: unknown 13830 1727204129.61314: variable 'ansible_shell_executable' from source: unknown 13830 1727204129.61320: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204129.61328: variable 'ansible_pipelining' from source: unknown 13830 1727204129.61334: variable 'ansible_timeout' from source: unknown 13830 1727204129.61341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204129.61571: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204129.61594: variable 'omit' from source: magic vars 13830 
1727204129.61604: starting attempt loop 13830 1727204129.61610: running the handler 13830 1727204129.61628: _low_level_execute_command(): starting 13830 1727204129.61639: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204129.62550: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.62555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.62590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204129.62596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204129.62599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204129.62601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.62673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204129.62679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204129.62691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204129.62729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204129.64261: stdout chunk (state=3): >>>/root <<< 13830 1727204129.64357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204129.64449: stderr chunk (state=3): >>><<< 13830 1727204129.64453: stdout chunk (state=3): >>><<< 13830 1727204129.64588: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204129.64592: _low_level_execute_command(): starting 13830 1727204129.64596: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856 `" && echo ansible-tmp-1727204129.644775-18314-232733875648856="` echo /root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856 `" ) && sleep 0' 13830 1727204129.65420: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204129.65444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204129.65460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.65480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.65528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204129.65541: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204129.65566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.65586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204129.65600: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204129.65615: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204129.65628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204129.65644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.65670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.65682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204129.65693: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204129.65707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.65813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204129.65837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204129.65856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204129.65934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204129.67735: stdout chunk (state=3): >>>ansible-tmp-1727204129.644775-18314-232733875648856=/root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856 <<< 13830 1727204129.67836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204129.67900: stderr chunk (state=3): >>><<< 13830 1727204129.67907: stdout chunk (state=3): >>><<< 13830 1727204129.67943: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204129.644775-18314-232733875648856=/root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204129.67974: variable 'ansible_module_compression' from source: unknown 13830 1727204129.68016: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13830 1727204129.68067: variable 'ansible_facts' from source: unknown 13830 1727204129.68206: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856/AnsiballZ_package_facts.py 13830 1727204129.68358: Sending initial data 13830 1727204129.68362: Sent initial data (161 bytes) 13830 1727204129.69252: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.69257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.69294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.69297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204129.69300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204129.69302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.69351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204129.69354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204129.69406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204129.71091: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204129.71125: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server 
handle limit 1019; using 64 <<< 13830 1727204129.71161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpx3mftyoe /root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856/AnsiballZ_package_facts.py <<< 13830 1727204129.71199: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204129.72925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204129.73040: stderr chunk (state=3): >>><<< 13830 1727204129.73044: stdout chunk (state=3): >>><<< 13830 1727204129.73059: done transferring module to remote 13830 1727204129.73074: _low_level_execute_command(): starting 13830 1727204129.73078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856/ /root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856/AnsiballZ_package_facts.py && sleep 0' 13830 1727204129.73571: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.73577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.73618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204129.73647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204129.73670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.73694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204129.73714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204129.73755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204129.75557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204129.75572: stdout chunk (state=3): >>><<< 13830 1727204129.75587: stderr chunk (state=3): >>><<< 13830 1727204129.75605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204129.75612: _low_level_execute_command(): starting 13830 1727204129.75620: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856/AnsiballZ_package_facts.py && sleep 0' 13830 1727204129.76286: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204129.76299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204129.76311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.76329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.76375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204129.76386: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204129.76400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.76415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204129.76425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204129.76437: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204129.76451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204129.76462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204129.76477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204129.76486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204129.76494: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204129.76508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204129.76587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204129.76609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204129.76623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204129.76702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204130.23467: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": 
[{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 13830 1727204130.23491: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 13830 1727204130.23530: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 13830 1727204130.23571: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": 
"0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 13830 1727204130.23588: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 13830 1727204130.23595: stdout chunk (state=3): >>>libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": 
"libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 13830 1727204130.23599: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", 
"release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", 
"version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 13830 1727204130.23619: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": 
[{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 13830 1727204130.23647: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 13830 1727204130.23656: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": 
"1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 13830 1727204130.23686: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": 
"11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 13830 1727204130.23698: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 13830 1727204130.23719: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13830 1727204130.25197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204130.25262: stderr chunk (state=3): >>><<< 13830 1727204130.25267: stdout chunk (state=3): >>><<< 13830 1727204130.25306: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204130.26774: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204130.26789: _low_level_execute_command(): starting 13830 1727204130.26794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204129.644775-18314-232733875648856/ > /dev/null 2>&1 && sleep 0' 13830 1727204130.27278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204130.27286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204130.27319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204130.27331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204130.27382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204130.27394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204130.27413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204130.27527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204130.29348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204130.29404: stderr chunk (state=3): >>><<< 13830 1727204130.29407: stdout chunk (state=3): >>><<< 13830 1727204130.29422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204130.29425: handler run complete 13830 1727204130.29942: variable 'ansible_facts' from source: unknown 13830 1727204130.30287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.31472: variable 'ansible_facts' from source: unknown 13830 1727204130.31748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.32198: attempt loop complete, returning result 13830 1727204130.32208: _execute() done 13830 1727204130.32211: dumping result to json 13830 1727204130.32336: done dumping result, returning 13830 1727204130.32344: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-1659-6b02-000000000fe4] 13830 1727204130.32351: sending task result for task 0affcd87-79f5-1659-6b02-000000000fe4 13830 1727204130.35206: done sending task result for task 0affcd87-79f5-1659-6b02-000000000fe4 13830 1727204130.35210: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204130.35401: no more pending results, returning what we have 13830 1727204130.35405: results queue empty 13830 1727204130.35406: checking for any_errors_fatal 13830 1727204130.35411: done checking for any_errors_fatal 13830 1727204130.35412: checking for max_fail_percentage 13830 1727204130.35414: done checking for max_fail_percentage 13830 1727204130.35415: checking to see if all hosts have failed and the running result is not ok 13830 1727204130.35416: done checking to see if all hosts have failed 13830 1727204130.35417: getting the remaining hosts for this loop 13830 1727204130.35418: done getting the remaining hosts for this loop 13830 1727204130.35422: getting the next task for host managed-node3 13830 1727204130.35430: done getting next task for host managed-node3 13830 1727204130.35437: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13830 1727204130.35444: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204130.35458: getting variables 13830 1727204130.35459: in VariableManager get_vars() 13830 1727204130.35506: Calling all_inventory to load vars for managed-node3 13830 1727204130.35509: Calling groups_inventory to load vars for managed-node3 13830 1727204130.35512: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204130.35521: Calling all_plugins_play to load vars for managed-node3 13830 1727204130.35524: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204130.35527: Calling groups_plugins_play to load vars for managed-node3 13830 1727204130.36908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.37881: done with get_vars() 13830 1727204130.37901: done getting variables 13830 1727204130.37949: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.787) 0:01:03.457 ***** 13830 1727204130.37985: entering _queue_task() for managed-node3/debug 13830 1727204130.38432: worker is 1 (out of 1 available) 13830 1727204130.38446: exiting _queue_task() for managed-node3/debug 13830 1727204130.38742: done queuing things up, now waiting for results queue to drain 13830 1727204130.38744: waiting for pending results... 
13830 1727204130.38767: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 13830 1727204130.38773: in run() - task 0affcd87-79f5-1659-6b02-000000000e0b 13830 1727204130.38779: variable 'ansible_search_path' from source: unknown 13830 1727204130.38783: variable 'ansible_search_path' from source: unknown 13830 1727204130.38816: calling self._execute() 13830 1727204130.38896: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.38901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.38907: variable 'omit' from source: magic vars 13830 1727204130.39202: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.39214: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204130.39219: variable 'omit' from source: magic vars 13830 1727204130.39271: variable 'omit' from source: magic vars 13830 1727204130.39352: variable 'network_provider' from source: set_fact 13830 1727204130.39367: variable 'omit' from source: magic vars 13830 1727204130.39404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204130.39443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204130.39472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204130.39493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204130.39508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204130.39538: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204130.39545: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.39552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.39647: Set connection var ansible_connection to ssh 13830 1727204130.39662: Set connection var ansible_timeout to 10 13830 1727204130.39673: Set connection var ansible_shell_executable to /bin/sh 13830 1727204130.39681: Set connection var ansible_shell_type to sh 13830 1727204130.39689: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204130.39701: Set connection var ansible_pipelining to False 13830 1727204130.39725: variable 'ansible_shell_executable' from source: unknown 13830 1727204130.39732: variable 'ansible_connection' from source: unknown 13830 1727204130.39737: variable 'ansible_module_compression' from source: unknown 13830 1727204130.39743: variable 'ansible_shell_type' from source: unknown 13830 1727204130.39748: variable 'ansible_shell_executable' from source: unknown 13830 1727204130.39753: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.39758: variable 'ansible_pipelining' from source: unknown 13830 1727204130.39765: variable 'ansible_timeout' from source: unknown 13830 1727204130.39772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.39895: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 13830 1727204130.39909: variable 'omit' from source: magic vars 13830 1727204130.39917: starting attempt loop 13830 1727204130.39922: running the handler 13830 1727204130.39968: handler run complete 13830 1727204130.39984: attempt loop complete, returning result 13830 1727204130.39989: _execute() done 13830 1727204130.39994: dumping result to json 13830 1727204130.40001: done dumping result, returning 13830 1727204130.40010: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-1659-6b02-000000000e0b] 13830 1727204130.40019: sending task result for task 0affcd87-79f5-1659-6b02-000000000e0b ok: [managed-node3] => {} MSG: Using network provider: nm 13830 1727204130.40172: no more pending results, returning what we have 13830 1727204130.40176: results queue empty 13830 1727204130.40177: checking for any_errors_fatal 13830 1727204130.40186: done checking for any_errors_fatal 13830 1727204130.40187: checking for max_fail_percentage 13830 1727204130.40189: done checking for max_fail_percentage 13830 1727204130.40190: checking to see if all hosts have failed and the running result is not ok 13830 1727204130.40191: done checking to see if all hosts have failed 13830 1727204130.40192: getting the remaining hosts for this loop 13830 1727204130.40193: done getting the remaining hosts for this loop 13830 1727204130.40197: getting the next task for host managed-node3 13830 1727204130.40206: done getting next task for host managed-node3 13830 1727204130.40210: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13830 1727204130.40216: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204130.40229: getting variables 13830 1727204130.40231: in VariableManager get_vars() 13830 1727204130.40280: Calling all_inventory to load vars for managed-node3 13830 1727204130.40283: Calling groups_inventory to load vars for managed-node3 13830 1727204130.40285: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204130.40295: Calling all_plugins_play to load vars for managed-node3 13830 1727204130.40297: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204130.40299: Calling groups_plugins_play to load vars for managed-node3 13830 1727204130.41333: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e0b 13830 1727204130.41339: WORKER PROCESS EXITING 13830 1727204130.41357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.42308: done with get_vars() 13830 1727204130.42331: done getting variables 13830 1727204130.42379: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.044) 0:01:03.502 ***** 13830 1727204130.42415: entering _queue_task() for managed-node3/fail 13830 1727204130.42667: worker is 1 (out of 1 available) 13830 1727204130.42682: exiting _queue_task() for managed-node3/fail 13830 1727204130.42695: done queuing things up, now waiting for results queue to drain 13830 1727204130.42697: waiting for pending results... 
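For reference, the "Print network provider" task whose result appears above (MSG: Using network provider: nm) is a plain debug action per this log; a minimal sketch of such a task follows, with the msg template assumed rather than taken from tasks/main.yml:7.

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # network_provider came from set_fact per the log; rendered here as "nm"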
13830 1727204130.42895: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13830 1727204130.42995: in run() - task 0affcd87-79f5-1659-6b02-000000000e0c 13830 1727204130.43008: variable 'ansible_search_path' from source: unknown 13830 1727204130.43011: variable 'ansible_search_path' from source: unknown 13830 1727204130.43042: calling self._execute() 13830 1727204130.43127: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.43131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.43141: variable 'omit' from source: magic vars 13830 1727204130.43447: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.43457: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204130.43550: variable 'network_state' from source: role '' defaults 13830 1727204130.43559: Evaluated conditional (network_state != {}): False 13830 1727204130.43563: when evaluation is False, skipping this task 13830 1727204130.43567: _execute() done 13830 1727204130.43570: dumping result to json 13830 1727204130.43572: done dumping result, returning 13830 1727204130.43581: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-1659-6b02-000000000e0c] 13830 1727204130.43587: sending task result for task 0affcd87-79f5-1659-6b02-000000000e0c 13830 1727204130.43683: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e0c 13830 1727204130.43686: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204130.43739: no more pending results, returning what we have 13830 1727204130.43743: results queue empty 13830 1727204130.43744: checking for any_errors_fatal 13830 1727204130.43750: done checking for any_errors_fatal 13830 1727204130.43751: checking for max_fail_percentage 13830 1727204130.43753: done checking for max_fail_percentage 13830 1727204130.43753: checking to see if all hosts have failed and the running result is not ok 13830 1727204130.43754: done checking to see if all hosts have failed 13830 1727204130.43755: getting the remaining hosts for this loop 13830 1727204130.43757: done getting the remaining hosts for this loop 13830 1727204130.43761: getting the next task for host managed-node3 13830 1727204130.43771: done getting next task for host managed-node3 13830 1727204130.43775: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13830 1727204130.43790: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204130.43818: getting variables 13830 1727204130.43821: in VariableManager get_vars() 13830 1727204130.43862: Calling all_inventory to load vars for managed-node3 13830 1727204130.43866: Calling groups_inventory to load vars for managed-node3 13830 1727204130.43868: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204130.43878: Calling all_plugins_play to load vars for managed-node3 13830 1727204130.43880: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204130.43882: Calling groups_plugins_play to load vars for managed-node3 13830 1727204130.44720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.45680: done with get_vars() 13830 1727204130.45700: done getting variables 13830 1727204130.45745: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.033) 0:01:03.535 ***** 13830 1727204130.45777: entering _queue_task() for managed-node3/fail 13830 1727204130.46023: worker is 1 (out of 1 available) 13830 1727204130.46037: exiting _queue_task() for managed-node3/fail 13830 1727204130.46050: done queuing things up, now waiting for results queue to drain 13830 1727204130.46052: waiting for pending results... 
13830 1727204130.46260: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13830 1727204130.46409: in run() - task 0affcd87-79f5-1659-6b02-000000000e0d 13830 1727204130.46430: variable 'ansible_search_path' from source: unknown 13830 1727204130.46440: variable 'ansible_search_path' from source: unknown 13830 1727204130.46480: calling self._execute() 13830 1727204130.46576: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.46586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.46597: variable 'omit' from source: magic vars 13830 1727204130.46942: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.46958: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204130.47075: variable 'network_state' from source: role '' defaults 13830 1727204130.47089: Evaluated conditional (network_state != {}): False 13830 1727204130.47095: when evaluation is False, skipping this task 13830 1727204130.47100: _execute() done 13830 1727204130.47106: dumping result to json 13830 1727204130.47112: done dumping result, returning 13830 1727204130.47122: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-1659-6b02-000000000e0d] 13830 1727204130.47132: sending task result for task 0affcd87-79f5-1659-6b02-000000000e0d skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204130.47284: no more pending results, returning what we have 13830 1727204130.47289: results queue empty 13830 1727204130.47289: checking for any_errors_fatal 13830 1727204130.47298: done checking for any_errors_fatal 13830 1727204130.47299: checking for max_fail_percentage 13830 1727204130.47300: done checking for max_fail_percentage 13830 1727204130.47301: checking to see if all hosts have failed and the running result is not ok 13830 1727204130.47302: done checking to see if all hosts have failed 13830 1727204130.47303: getting the remaining hosts for this loop 13830 1727204130.47305: done getting the remaining hosts for this loop 13830 1727204130.47309: getting the next task for host managed-node3 13830 1727204130.47317: done getting next task for host managed-node3 13830 1727204130.47321: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13830 1727204130.47332: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204130.47367: getting variables 13830 1727204130.47369: in VariableManager get_vars() 13830 1727204130.47415: Calling all_inventory to load vars for managed-node3 13830 1727204130.47418: Calling groups_inventory to load vars for managed-node3 13830 1727204130.47420: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204130.47433: Calling all_plugins_play to load vars for managed-node3 13830 1727204130.47443: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204130.47447: Calling groups_plugins_play to load vars for managed-node3 13830 1727204130.47971: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e0d 13830 1727204130.47976: WORKER PROCESS EXITING 13830 1727204130.48750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.49684: done with get_vars() 13830 1727204130.49703: done getting variables 13830 1727204130.49750: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.040) 0:01:03.575 ***** 13830 1727204130.49779: entering _queue_task() for managed-node3/fail 13830 1727204130.50029: worker is 1 (out of 1 available) 13830 1727204130.50042: exiting _queue_task() for managed-node3/fail 13830 1727204130.50055: done queuing things up, now waiting for results queue to drain 13830 1727204130.50056: waiting for pending results... 
13830 1727204130.50261: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13830 1727204130.50368: in run() - task 0affcd87-79f5-1659-6b02-000000000e0e 13830 1727204130.50381: variable 'ansible_search_path' from source: unknown 13830 1727204130.50386: variable 'ansible_search_path' from source: unknown 13830 1727204130.50415: calling self._execute() 13830 1727204130.50499: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.50503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.50511: variable 'omit' from source: magic vars 13830 1727204130.50800: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.50814: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204130.50942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204130.52629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204130.52681: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204130.52710: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204130.52736: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204130.52762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204130.52827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.52850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.52869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.52900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.52911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.52989: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.53003: Evaluated conditional (ansible_distribution_major_version | int > 9): False 13830 1727204130.53006: when evaluation is False, skipping this task 13830 1727204130.53009: _execute() done 13830 1727204130.53012: dumping result to json 13830 1727204130.53014: done dumping result, returning 13830 1727204130.53021: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-1659-6b02-000000000e0e] 13830 1727204130.53026: sending task result for task 
0affcd87-79f5-1659-6b02-000000000e0e 13830 1727204130.53123: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e0e 13830 1727204130.53126: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 13830 1727204130.53178: no more pending results, returning what we have 13830 1727204130.53182: results queue empty 13830 1727204130.53183: checking for any_errors_fatal 13830 1727204130.53190: done checking for any_errors_fatal 13830 1727204130.53190: checking for max_fail_percentage 13830 1727204130.53192: done checking for max_fail_percentage 13830 1727204130.53193: checking to see if all hosts have failed and the running result is not ok 13830 1727204130.53194: done checking to see if all hosts have failed 13830 1727204130.53194: getting the remaining hosts for this loop 13830 1727204130.53196: done getting the remaining hosts for this loop 13830 1727204130.53201: getting the next task for host managed-node3 13830 1727204130.53208: done getting next task for host managed-node3 13830 1727204130.53213: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13830 1727204130.53219: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204130.53249: getting variables 13830 1727204130.53251: in VariableManager get_vars() 13830 1727204130.53298: Calling all_inventory to load vars for managed-node3 13830 1727204130.53301: Calling groups_inventory to load vars for managed-node3 13830 1727204130.53303: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204130.53312: Calling all_plugins_play to load vars for managed-node3 13830 1727204130.53314: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204130.53316: Calling groups_plugins_play to load vars for managed-node3 13830 1727204130.54159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.55216: done with get_vars() 13830 1727204130.55233: done getting variables 13830 1727204130.55280: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.055) 0:01:03.631 ***** 13830 1727204130.55308: entering _queue_task() for managed-node3/dnf 13830 1727204130.55557: worker is 1 (out of 1 available) 13830 1727204130.55577: exiting _queue_task() for managed-node3/dnf 13830 1727204130.55590: done queuing things up, now waiting for results queue to drain 13830 1727204130.55592: waiting for pending results... 
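As above, a hedged sketch of the task just queued (roles/network/tasks/main.yml:36), based only on what this log confirms: the 'dnf' action plugin, the task name, and the two conditions evaluated below (the Fedora/EL8+ check, which passes, and the wireless-or-team check, which fails and causes the skip). The module arguments and check_mode flag are assumptions; they are not visible in the log.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # assumed argument; not shown in the log
    state: latest
  check_mode: true                  # assumed, since the task only checks for updates
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined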
13830 1727204130.55801: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13830 1727204130.55905: in run() - task 0affcd87-79f5-1659-6b02-000000000e0f 13830 1727204130.55915: variable 'ansible_search_path' from source: unknown 13830 1727204130.55920: variable 'ansible_search_path' from source: unknown 13830 1727204130.55952: calling self._execute() 13830 1727204130.56028: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.56033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.56043: variable 'omit' from source: magic vars 13830 1727204130.56324: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.56334: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204130.56480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204130.58173: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204130.58220: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204130.58253: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204130.58280: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204130.58300: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204130.58367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.58389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.58407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.58439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.58451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.58546: variable 'ansible_distribution' from source: facts 13830 1727204130.58550: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.58567: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13830 1727204130.58650: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204130.58738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.58761: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.58782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.58810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.58822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.58851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.58871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.58896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.58923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.58936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.58961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.58982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.59000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.59026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.59038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.59147: variable 'network_connections' from source: task vars 13830 1727204130.59157: variable 'port2_profile' from source: play vars 13830 1727204130.59207: variable 'port2_profile' from source: play vars 13830 1727204130.59217: variable 'port1_profile' from source: play vars 13830 1727204130.59263: variable 'port1_profile' from source: play vars 13830 1727204130.59272: variable 'controller_profile' from source: play vars 
13830 1727204130.59315: variable 'controller_profile' from source: play vars 13830 1727204130.59370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204130.59511: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204130.59542: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204130.59572: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204130.59595: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204130.59630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204130.59658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204130.59676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.59693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204130.59736: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204130.59900: variable 'network_connections' from source: task vars 13830 1727204130.59904: variable 'port2_profile' from source: play vars 13830 1727204130.59950: variable 'port2_profile' from source: play vars 13830 1727204130.59956: variable 'port1_profile' from source: play vars 13830 1727204130.60002: variable 'port1_profile' from source: play vars 13830 1727204130.60008: variable 'controller_profile' from source: play vars 13830 1727204130.60052: variable 'controller_profile' from source: play vars 13830 1727204130.60074: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204130.60081: when evaluation is False, skipping this task 13830 1727204130.60084: _execute() done 13830 1727204130.60087: dumping result to json 13830 1727204130.60089: done dumping result, returning 13830 1727204130.60100: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000e0f] 13830 1727204130.60105: sending task result for task 0affcd87-79f5-1659-6b02-000000000e0f 13830 1727204130.60206: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e0f 13830 1727204130.60209: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204130.60257: no more pending results, returning what we have 13830 1727204130.60261: results queue empty 13830 1727204130.60262: checking for any_errors_fatal 13830 1727204130.60272: done checking for any_errors_fatal 13830 1727204130.60272: checking for max_fail_percentage 13830 1727204130.60274: done checking 
for max_fail_percentage 13830 1727204130.60275: checking to see if all hosts have failed and the running result is not ok 13830 1727204130.60276: done checking to see if all hosts have failed 13830 1727204130.60277: getting the remaining hosts for this loop 13830 1727204130.60278: done getting the remaining hosts for this loop 13830 1727204130.60283: getting the next task for host managed-node3 13830 1727204130.60295: done getting next task for host managed-node3 13830 1727204130.60299: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13830 1727204130.60309: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204130.60334: getting variables 13830 1727204130.60336: in VariableManager get_vars() 13830 1727204130.60380: Calling all_inventory to load vars for managed-node3 13830 1727204130.60383: Calling groups_inventory to load vars for managed-node3 13830 1727204130.60385: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204130.60398: Calling all_plugins_play to load vars for managed-node3 13830 1727204130.60401: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204130.60404: Calling groups_plugins_play to load vars for managed-node3 13830 1727204130.61277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.62218: done with get_vars() 13830 1727204130.62244: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13830 1727204130.62303: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.070) 0:01:03.701 ***** 13830 1727204130.62330: entering _queue_task() for managed-node3/yum 13830 1727204130.62597: worker is 1 (out of 1 available) 13830 1727204130.62612: exiting _queue_task() for managed-node3/yum 13830 1727204130.62625: done queuing things up, now waiting for results queue to drain 13830 1727204130.62626: waiting for pending results... 
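The same kind of sketch for the task queued above (roles/network/tasks/main.yml:48). The log confirms the task name, that yum is redirected to the ansible.builtin.dnf action plugin on this host, and that the skip happens on ansible_distribution_major_version | int < 8; because a when list is evaluated in order and stops at the first false condition, any further conditions on the real task would not show up in this log. The module arguments are assumptions.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:                              # redirected to ansible.builtin.dnf, as the log notes
    name: "{{ network_packages }}"  # assumed argument
    state: latest
  check_mode: true                  # assumed
  when:
    - ansible_distribution_major_version | int < 8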
13830 1727204130.62827: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13830 1727204130.62948: in run() - task 0affcd87-79f5-1659-6b02-000000000e10 13830 1727204130.62959: variable 'ansible_search_path' from source: unknown 13830 1727204130.62963: variable 'ansible_search_path' from source: unknown 13830 1727204130.62998: calling self._execute() 13830 1727204130.63076: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.63081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.63088: variable 'omit' from source: magic vars 13830 1727204130.63375: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.63386: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204130.63513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204130.65213: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204130.65266: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204130.65295: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204130.65321: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204130.65344: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204130.65408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.65428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.65449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.65479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.65494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.65573: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.65588: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13830 1727204130.65595: when evaluation is False, skipping this task 13830 1727204130.65599: _execute() done 13830 1727204130.65601: dumping result to json 13830 1727204130.65604: done dumping result, returning 13830 1727204130.65612: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000e10] 13830 
1727204130.65622: sending task result for task 0affcd87-79f5-1659-6b02-000000000e10 13830 1727204130.65723: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e10 13830 1727204130.65726: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13830 1727204130.65780: no more pending results, returning what we have 13830 1727204130.65784: results queue empty 13830 1727204130.65785: checking for any_errors_fatal 13830 1727204130.65794: done checking for any_errors_fatal 13830 1727204130.65794: checking for max_fail_percentage 13830 1727204130.65796: done checking for max_fail_percentage 13830 1727204130.65797: checking to see if all hosts have failed and the running result is not ok 13830 1727204130.65797: done checking to see if all hosts have failed 13830 1727204130.65798: getting the remaining hosts for this loop 13830 1727204130.65800: done getting the remaining hosts for this loop 13830 1727204130.65804: getting the next task for host managed-node3 13830 1727204130.65813: done getting next task for host managed-node3 13830 1727204130.65817: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13830 1727204130.65823: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204130.65851: getting variables 13830 1727204130.65853: in VariableManager get_vars() 13830 1727204130.65902: Calling all_inventory to load vars for managed-node3 13830 1727204130.65904: Calling groups_inventory to load vars for managed-node3 13830 1727204130.65906: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204130.65918: Calling all_plugins_play to load vars for managed-node3 13830 1727204130.65921: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204130.65926: Calling groups_plugins_play to load vars for managed-node3 13830 1727204130.67187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.68516: done with get_vars() 13830 1727204130.68551: done getting variables 13830 1727204130.68614: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.063) 0:01:03.764 ***** 13830 1727204130.68652: entering _queue_task() for managed-node3/fail 13830 1727204130.69010: worker is 1 (out of 1 available) 13830 1727204130.69023: exiting _queue_task() for managed-node3/fail 13830 1727204130.69038: done queuing things up, now waiting for results queue to drain 13830 1727204130.69040: waiting for pending results... 
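Again a sketch grounded only in the log below: the 'fail' action plugin, the task name, and the false_condition __network_wireless_connections_defined or __network_team_connections_defined. The message wording is invented for illustration, and the real task may carry additional conditions that are never reached because evaluation stops at this first false one.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    # Wording is an assumption; only the module and the condition are confirmed by the log.
    msg: NetworkManager must be restarted to manage wireless or team interfaces; please confirm this is acceptable
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined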
13830 1727204130.69353: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13830 1727204130.69540: in run() - task 0affcd87-79f5-1659-6b02-000000000e11 13830 1727204130.69565: variable 'ansible_search_path' from source: unknown 13830 1727204130.69580: variable 'ansible_search_path' from source: unknown 13830 1727204130.69626: calling self._execute() 13830 1727204130.69728: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.69741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.69754: variable 'omit' from source: magic vars 13830 1727204130.70159: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.70179: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204130.70310: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204130.70528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204130.73128: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204130.73221: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204130.73278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204130.73328: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204130.73366: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204130.73468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.73503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.73546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.73598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.73620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.73689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.73719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.73756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.73808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.73830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.73890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.74852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.74890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.74949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.75059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.75507: variable 'network_connections' from source: task vars 13830 1727204130.75526: variable 'port2_profile' from source: play vars 13830 1727204130.75712: variable 'port2_profile' from source: play vars 13830 1727204130.75773: variable 'port1_profile' from source: play vars 13830 1727204130.75968: variable 'port1_profile' from source: play vars 13830 1727204130.75986: variable 'controller_profile' from source: play vars 13830 1727204130.76069: variable 'controller_profile' from source: play vars 13830 1727204130.76220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204130.76807: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204130.76861: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204130.77011: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204130.77049: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204130.77178: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204130.77359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204130.77394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.77429: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204130.77609: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204130.78156: variable 'network_connections' from source: task vars 13830 1727204130.78169: variable 'port2_profile' from source: play vars 13830 1727204130.78253: variable 'port2_profile' from source: play vars 13830 1727204130.78262: variable 'port1_profile' from source: play vars 13830 1727204130.78332: variable 'port1_profile' from source: play vars 13830 1727204130.78343: variable 'controller_profile' from source: play vars 13830 1727204130.78402: variable 'controller_profile' from source: play vars 13830 1727204130.78550: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204130.78561: when evaluation is False, skipping this task 13830 1727204130.78565: _execute() done 13830 1727204130.78568: dumping result to json 13830 1727204130.78570: done dumping result, returning 13830 1727204130.78573: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000e11] 13830 1727204130.78576: sending task result for task 0affcd87-79f5-1659-6b02-000000000e11 13830 1727204130.78682: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e11 13830 1727204130.78687: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204130.78740: no more pending results, returning what we have 13830 1727204130.78745: results queue empty 13830 1727204130.78746: checking for any_errors_fatal 13830 1727204130.78753: done checking for any_errors_fatal 13830 1727204130.78754: checking for max_fail_percentage 13830 1727204130.78756: done checking for max_fail_percentage 13830 1727204130.78757: checking to see if all hosts have failed and the running result is not ok 13830 1727204130.78758: done checking to see if all hosts have failed 13830 1727204130.78758: getting the remaining hosts for this loop 13830 1727204130.78760: done getting the remaining hosts for this loop 13830 1727204130.78765: getting the next task for host managed-node3 13830 1727204130.78773: done getting next task for host managed-node3 13830 1727204130.78777: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13830 1727204130.78782: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204130.78805: getting variables 13830 1727204130.78806: in VariableManager get_vars() 13830 1727204130.78856: Calling all_inventory to load vars for managed-node3 13830 1727204130.78859: Calling groups_inventory to load vars for managed-node3 13830 1727204130.78862: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204130.78936: Calling all_plugins_play to load vars for managed-node3 13830 1727204130.78940: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204130.78943: Calling groups_plugins_play to load vars for managed-node3 13830 1727204130.80314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204130.82101: done with get_vars() 13830 1727204130.82142: done getting variables 13830 1727204130.82215: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.136) 0:01:03.900 ***** 13830 1727204130.82260: entering _queue_task() for managed-node3/package 13830 1727204130.82627: worker is 1 (out of 1 available) 13830 1727204130.82642: exiting _queue_task() for managed-node3/package 13830 1727204130.82654: done queuing things up, now waiting for results queue to drain 13830 1727204130.82656: waiting for pending results... 
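A sketch of the "Install packages" task queued above (roles/network/tasks/main.yml:73). The log below confirms the 'package' action plugin, the resolution of network_packages from the role defaults, and the skip condition not network_packages is subset(ansible_facts.packages.keys()) (the subset test comes from the mathstuff test plugin the log loads). The state argument is an assumption.

- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present                  # assumed; not visible in the log
  when:
    - not network_packages is subset(ansible_facts.packages.keys())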
13830 1727204130.82956: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 13830 1727204130.83119: in run() - task 0affcd87-79f5-1659-6b02-000000000e12 13830 1727204130.83139: variable 'ansible_search_path' from source: unknown 13830 1727204130.83147: variable 'ansible_search_path' from source: unknown 13830 1727204130.83190: calling self._execute() 13830 1727204130.83292: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204130.83302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204130.83320: variable 'omit' from source: magic vars 13830 1727204130.83707: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.83730: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204130.83937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204130.84216: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204130.84263: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204130.84307: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204130.84342: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204130.84465: variable 'network_packages' from source: role '' defaults 13830 1727204130.84580: variable '__network_provider_setup' from source: role '' defaults 13830 1727204130.84597: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204130.84669: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204130.84686: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204130.84754: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204130.84986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204130.93990: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204130.94084: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204130.94130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204130.94176: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204130.94206: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204130.94292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.94327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.94360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.94417: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.94437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.94494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.94525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.94555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.94610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.94629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.94850: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13830 1727204130.94968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.94996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.95028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.95072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.95091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.95221: variable 'ansible_python' from source: facts 13830 1727204130.95248: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13830 1727204130.95334: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204130.95421: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204130.95550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.95586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 13830 1727204130.95618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.95663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.95690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.95741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204130.95786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204130.95817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.95861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204130.95887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204130.96035: variable 'network_connections' from source: task vars 13830 1727204130.96046: variable 'port2_profile' from source: play vars 13830 1727204130.96152: variable 'port2_profile' from source: play vars 13830 1727204130.96171: variable 'port1_profile' from source: play vars 13830 1727204130.96275: variable 'port1_profile' from source: play vars 13830 1727204130.96290: variable 'controller_profile' from source: play vars 13830 1727204130.96392: variable 'controller_profile' from source: play vars 13830 1727204130.96488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204130.96517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204130.96552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204130.96589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204130.96633: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204130.96931: variable 'network_connections' from source: task vars 13830 1727204130.96941: variable 'port2_profile' from source: play vars 13830 1727204130.97052: variable 'port2_profile' from source: play vars 13830 
1727204130.97069: variable 'port1_profile' from source: play vars 13830 1727204130.97179: variable 'port1_profile' from source: play vars 13830 1727204130.97197: variable 'controller_profile' from source: play vars 13830 1727204130.97307: variable 'controller_profile' from source: play vars 13830 1727204130.97343: variable '__network_packages_default_wireless' from source: role '' defaults 13830 1727204130.97424: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204130.97734: variable 'network_connections' from source: task vars 13830 1727204130.97746: variable 'port2_profile' from source: play vars 13830 1727204130.97809: variable 'port2_profile' from source: play vars 13830 1727204130.97820: variable 'port1_profile' from source: play vars 13830 1727204130.97886: variable 'port1_profile' from source: play vars 13830 1727204130.97897: variable 'controller_profile' from source: play vars 13830 1727204130.97958: variable 'controller_profile' from source: play vars 13830 1727204130.97992: variable '__network_packages_default_team' from source: role '' defaults 13830 1727204130.98103: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204130.98418: variable 'network_connections' from source: task vars 13830 1727204130.98428: variable 'port2_profile' from source: play vars 13830 1727204130.98500: variable 'port2_profile' from source: play vars 13830 1727204130.98513: variable 'port1_profile' from source: play vars 13830 1727204130.98582: variable 'port1_profile' from source: play vars 13830 1727204130.98593: variable 'controller_profile' from source: play vars 13830 1727204130.98660: variable 'controller_profile' from source: play vars 13830 1727204130.98722: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204130.98786: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204130.98798: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204130.98867: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204130.99098: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13830 1727204130.99616: variable 'network_connections' from source: task vars 13830 1727204130.99626: variable 'port2_profile' from source: play vars 13830 1727204130.99695: variable 'port2_profile' from source: play vars 13830 1727204130.99707: variable 'port1_profile' from source: play vars 13830 1727204130.99770: variable 'port1_profile' from source: play vars 13830 1727204130.99783: variable 'controller_profile' from source: play vars 13830 1727204130.99847: variable 'controller_profile' from source: play vars 13830 1727204130.99861: variable 'ansible_distribution' from source: facts 13830 1727204130.99873: variable '__network_rh_distros' from source: role '' defaults 13830 1727204130.99882: variable 'ansible_distribution_major_version' from source: facts 13830 1727204130.99901: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13830 1727204131.00085: variable 'ansible_distribution' from source: facts 13830 1727204131.00093: variable '__network_rh_distros' from source: role '' defaults 13830 1727204131.00102: variable 'ansible_distribution_major_version' from source: facts 13830 1727204131.00117: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13830 1727204131.00304: 
variable 'ansible_distribution' from source: facts 13830 1727204131.00313: variable '__network_rh_distros' from source: role '' defaults 13830 1727204131.00322: variable 'ansible_distribution_major_version' from source: facts 13830 1727204131.00372: variable 'network_provider' from source: set_fact 13830 1727204131.00393: variable 'ansible_facts' from source: unknown 13830 1727204131.01184: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13830 1727204131.01193: when evaluation is False, skipping this task 13830 1727204131.01200: _execute() done 13830 1727204131.01206: dumping result to json 13830 1727204131.01219: done dumping result, returning 13830 1727204131.01231: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-1659-6b02-000000000e12] 13830 1727204131.01243: sending task result for task 0affcd87-79f5-1659-6b02-000000000e12 skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13830 1727204131.01416: no more pending results, returning what we have 13830 1727204131.01420: results queue empty 13830 1727204131.01421: checking for any_errors_fatal 13830 1727204131.01430: done checking for any_errors_fatal 13830 1727204131.01431: checking for max_fail_percentage 13830 1727204131.01436: done checking for max_fail_percentage 13830 1727204131.01437: checking to see if all hosts have failed and the running result is not ok 13830 1727204131.01438: done checking to see if all hosts have failed 13830 1727204131.01438: getting the remaining hosts for this loop 13830 1727204131.01440: done getting the remaining hosts for this loop 13830 1727204131.01444: getting the next task for host managed-node3 13830 1727204131.01453: done getting next task for host managed-node3 13830 1727204131.01468: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13830 1727204131.01473: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204131.01497: getting variables 13830 1727204131.01499: in VariableManager get_vars() 13830 1727204131.01555: Calling all_inventory to load vars for managed-node3 13830 1727204131.01558: Calling groups_inventory to load vars for managed-node3 13830 1727204131.01560: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204131.01573: Calling all_plugins_play to load vars for managed-node3 13830 1727204131.01576: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204131.01580: Calling groups_plugins_play to load vars for managed-node3 13830 1727204131.02975: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e12 13830 1727204131.02978: WORKER PROCESS EXITING 13830 1727204131.12615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204131.16036: done with get_vars() 13830 1727204131.16076: done getting variables 13830 1727204131.16128: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.339) 0:01:04.239 ***** 13830 1727204131.16167: entering _queue_task() for managed-node3/package 13830 1727204131.16624: worker is 1 (out of 1 available) 13830 1727204131.16640: exiting _queue_task() for managed-node3/package 13830 1727204131.16651: done queuing things up, now waiting for results queue to drain 13830 1727204131.16653: waiting for pending results... 
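The corresponding sketch for the task queued above (roles/network/tasks/main.yml:85): the log below confirms the 'package' action plugin and the gate network_state != {}, which is false here because network_state is still its empty role default. The package names are inferred from the task title and are assumptions.

- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:                 # names inferred from the task title, not from the log
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}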
13830 1727204131.16956: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13830 1727204131.17145: in run() - task 0affcd87-79f5-1659-6b02-000000000e13 13830 1727204131.17167: variable 'ansible_search_path' from source: unknown 13830 1727204131.17177: variable 'ansible_search_path' from source: unknown 13830 1727204131.17231: calling self._execute() 13830 1727204131.17342: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204131.17354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204131.17371: variable 'omit' from source: magic vars 13830 1727204131.17800: variable 'ansible_distribution_major_version' from source: facts 13830 1727204131.17817: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204131.17948: variable 'network_state' from source: role '' defaults 13830 1727204131.17969: Evaluated conditional (network_state != {}): False 13830 1727204131.17984: when evaluation is False, skipping this task 13830 1727204131.17993: _execute() done 13830 1727204131.18002: dumping result to json 13830 1727204131.18011: done dumping result, returning 13830 1727204131.18023: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-1659-6b02-000000000e13] 13830 1727204131.18040: sending task result for task 0affcd87-79f5-1659-6b02-000000000e13 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204131.18222: no more pending results, returning what we have 13830 1727204131.18227: results queue empty 13830 1727204131.18227: checking for any_errors_fatal 13830 1727204131.18239: done checking for any_errors_fatal 13830 1727204131.18240: checking for max_fail_percentage 13830 1727204131.18242: done checking for max_fail_percentage 13830 1727204131.18243: checking to see if all hosts have failed and the running result is not ok 13830 1727204131.18244: done checking to see if all hosts have failed 13830 1727204131.18244: getting the remaining hosts for this loop 13830 1727204131.18247: done getting the remaining hosts for this loop 13830 1727204131.18251: getting the next task for host managed-node3 13830 1727204131.18261: done getting next task for host managed-node3 13830 1727204131.18267: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13830 1727204131.18273: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204131.18300: getting variables 13830 1727204131.18303: in VariableManager get_vars() 13830 1727204131.18359: Calling all_inventory to load vars for managed-node3 13830 1727204131.18362: Calling groups_inventory to load vars for managed-node3 13830 1727204131.18368: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204131.18381: Calling all_plugins_play to load vars for managed-node3 13830 1727204131.18384: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204131.18387: Calling groups_plugins_play to load vars for managed-node3 13830 1727204131.19499: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e13 13830 1727204131.19505: WORKER PROCESS EXITING 13830 1727204131.20248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204131.22123: done with get_vars() 13830 1727204131.22159: done getting variables 13830 1727204131.22227: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.061) 0:01:04.300 ***** 13830 1727204131.22274: entering _queue_task() for managed-node3/package 13830 1727204131.22635: worker is 1 (out of 1 available) 13830 1727204131.22648: exiting _queue_task() for managed-node3/package 13830 1727204131.22661: done queuing things up, now waiting for results queue to drain 13830 1727204131.22663: waiting for pending results... 
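Both nmstate-related install tasks in this stretch are gated the same way: "ansible_distribution_major_version != '6'" passes, but "network_state != {}" does not, because network_state is still the empty role default. The stand-alone Jinja2 sketch below reproduces the two logged evaluations; it is not Ansible's own templating path, and the fact value "9" is an assumption rather than something taken from the log.

    # Evaluate the two logged `when:` expressions with plain Jinja2.
    # The variable values are assumptions, not read from the log.
    from jinja2 import Environment

    env = Environment()
    task_vars = {"ansible_distribution_major_version": "9", "network_state": {}}

    not_el6 = env.compile_expression("ansible_distribution_major_version != '6'")(**task_vars)
    state_set = env.compile_expression("network_state != {}")(**task_vars)

    print(not_el6, state_set)  # True False -> skipped on the second condition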
13830 1727204131.22975: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13830 1727204131.23153: in run() - task 0affcd87-79f5-1659-6b02-000000000e14 13830 1727204131.23177: variable 'ansible_search_path' from source: unknown 13830 1727204131.23186: variable 'ansible_search_path' from source: unknown 13830 1727204131.23232: calling self._execute() 13830 1727204131.23346: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204131.23357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204131.23375: variable 'omit' from source: magic vars 13830 1727204131.23773: variable 'ansible_distribution_major_version' from source: facts 13830 1727204131.23792: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204131.23926: variable 'network_state' from source: role '' defaults 13830 1727204131.23947: Evaluated conditional (network_state != {}): False 13830 1727204131.23955: when evaluation is False, skipping this task 13830 1727204131.23962: _execute() done 13830 1727204131.23972: dumping result to json 13830 1727204131.23984: done dumping result, returning 13830 1727204131.23996: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-1659-6b02-000000000e14] 13830 1727204131.24006: sending task result for task 0affcd87-79f5-1659-6b02-000000000e14 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204131.24185: no more pending results, returning what we have 13830 1727204131.24190: results queue empty 13830 1727204131.24191: checking for any_errors_fatal 13830 1727204131.24201: done checking for any_errors_fatal 13830 1727204131.24202: checking for max_fail_percentage 13830 1727204131.24204: done checking for max_fail_percentage 13830 1727204131.24206: checking to see if all hosts have failed and the running result is not ok 13830 1727204131.24206: done checking to see if all hosts have failed 13830 1727204131.24207: getting the remaining hosts for this loop 13830 1727204131.24209: done getting the remaining hosts for this loop 13830 1727204131.24214: getting the next task for host managed-node3 13830 1727204131.24224: done getting next task for host managed-node3 13830 1727204131.24230: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13830 1727204131.24241: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204131.24269: getting variables 13830 1727204131.24272: in VariableManager get_vars() 13830 1727204131.24324: Calling all_inventory to load vars for managed-node3 13830 1727204131.24327: Calling groups_inventory to load vars for managed-node3 13830 1727204131.24329: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204131.24344: Calling all_plugins_play to load vars for managed-node3 13830 1727204131.24347: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204131.24350: Calling groups_plugins_play to load vars for managed-node3 13830 1727204131.25304: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e14 13830 1727204131.25308: WORKER PROCESS EXITING 13830 1727204131.26318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204131.28056: done with get_vars() 13830 1727204131.28086: done getting variables 13830 1727204131.28152: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.059) 0:01:04.360 ***** 13830 1727204131.28194: entering _queue_task() for managed-node3/service 13830 1727204131.28568: worker is 1 (out of 1 available) 13830 1727204131.28581: exiting _queue_task() for managed-node3/service 13830 1727204131.28593: done queuing things up, now waiting for results queue to drain 13830 1727204131.28595: waiting for pending results... 
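For reference, each skipped task above returns the same small result structure that the callback prints as "skipping: [managed-node3] => {...}". The sketch below mirrors those keys and shows one hypothetical way a registered copy of such a result could be inspected; the helper function is illustrative and not part of Ansible.

    # Shape of the skip result seen in the log, plus a hypothetical check.
    skip_result = {
        "changed": False,
        "false_condition": "network_state != {}",
        "skip_reason": "Conditional result was False",
    }

    def task_ran(result):
        # Skipped tasks carry a skip_reason and report changed=False.
        return "skip_reason" not in result

    print(task_ran(skip_result))  # False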
13830 1727204131.28915: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13830 1727204131.29105: in run() - task 0affcd87-79f5-1659-6b02-000000000e15 13830 1727204131.29125: variable 'ansible_search_path' from source: unknown 13830 1727204131.29131: variable 'ansible_search_path' from source: unknown 13830 1727204131.29181: calling self._execute() 13830 1727204131.29289: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204131.29299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204131.29310: variable 'omit' from source: magic vars 13830 1727204131.29701: variable 'ansible_distribution_major_version' from source: facts 13830 1727204131.29720: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204131.29855: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204131.30074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204131.36056: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204131.36173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204131.36349: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204131.36392: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204131.36550: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204131.36749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204131.36787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204131.36819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204131.36982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204131.37001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204131.37049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204131.37084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204131.37195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 13830 1727204131.37241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204131.37259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204131.37324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204131.37415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204131.37525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204131.37572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204131.37623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204131.37929: variable 'network_connections' from source: task vars 13830 1727204131.38061: variable 'port2_profile' from source: play vars 13830 1727204131.38143: variable 'port2_profile' from source: play vars 13830 1727204131.38278: variable 'port1_profile' from source: play vars 13830 1727204131.38348: variable 'port1_profile' from source: play vars 13830 1727204131.38493: variable 'controller_profile' from source: play vars 13830 1727204131.38556: variable 'controller_profile' from source: play vars 13830 1727204131.38749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204131.39175: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204131.39218: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204131.39260: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204131.39382: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204131.39432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204131.39485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204131.39597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204131.39628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204131.39799: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204131.40282: variable 'network_connections' from source: task vars 13830 1727204131.40292: variable 'port2_profile' from source: play vars 13830 1727204131.40410: variable 'port2_profile' from source: play vars 13830 1727204131.40443: variable 'port1_profile' from source: play vars 13830 1727204131.40596: variable 'port1_profile' from source: play vars 13830 1727204131.40608: variable 'controller_profile' from source: play vars 13830 1727204131.40680: variable 'controller_profile' from source: play vars 13830 1727204131.40711: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13830 1727204131.40729: when evaluation is False, skipping this task 13830 1727204131.40737: _execute() done 13830 1727204131.40744: dumping result to json 13830 1727204131.40754: done dumping result, returning 13830 1727204131.40774: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-1659-6b02-000000000e15] 13830 1727204131.40785: sending task result for task 0affcd87-79f5-1659-6b02-000000000e15 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13830 1727204131.40945: no more pending results, returning what we have 13830 1727204131.40949: results queue empty 13830 1727204131.40950: checking for any_errors_fatal 13830 1727204131.40956: done checking for any_errors_fatal 13830 1727204131.40956: checking for max_fail_percentage 13830 1727204131.40958: done checking for max_fail_percentage 13830 1727204131.40959: checking to see if all hosts have failed and the running result is not ok 13830 1727204131.40960: done checking to see if all hosts have failed 13830 1727204131.40961: getting the remaining hosts for this loop 13830 1727204131.40962: done getting the remaining hosts for this loop 13830 1727204131.40969: getting the next task for host managed-node3 13830 1727204131.40979: done getting next task for host managed-node3 13830 1727204131.40983: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13830 1727204131.40988: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13830 1727204131.41014: getting variables 13830 1727204131.41016: in VariableManager get_vars() 13830 1727204131.41068: Calling all_inventory to load vars for managed-node3 13830 1727204131.41071: Calling groups_inventory to load vars for managed-node3 13830 1727204131.41073: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204131.41084: Calling all_plugins_play to load vars for managed-node3 13830 1727204131.41086: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204131.41089: Calling groups_plugins_play to load vars for managed-node3 13830 1727204131.41853: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e15 13830 1727204131.41856: WORKER PROCESS EXITING 13830 1727204131.43791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204131.47684: done with get_vars() 13830 1727204131.47712: done getting variables 13830 1727204131.47982: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.198) 0:01:04.558 ***** 13830 1727204131.48023: entering _queue_task() for managed-node3/service 13830 1727204131.48600: worker is 1 (out of 1 available) 13830 1727204131.48613: exiting _queue_task() for managed-node3/service 13830 1727204131.48626: done queuing things up, now waiting for results queue to drain 13830 1727204131.48628: waiting for pending results... 
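The "Restart NetworkManager due to wireless or team interfaces" task above was skipped because none of the three profiles referenced by network_connections (one controller profile and two port profiles, per controller_profile, port1_profile and port2_profile) is a wireless or team connection. The Python sketch below approximates that check; the exact connection types are not shown in the log, so the bond-style layout is an assumption.

    # Approximation of "__network_wireless_connections_defined or
    # __network_team_connections_defined"; the connection dicts are assumptions.
    network_connections = [
        {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},  # port1_profile
        {"name": "bond0.1", "type": "ethernet", "controller": "bond0"},  # port2_profile
        {"name": "bond0", "type": "bond"},                               # controller_profile
    ]

    wireless = any(c.get("type") == "wireless" for c in network_connections)
    team = any(c.get("type") == "team" for c in network_connections)

    print(wireless or team)  # False -> NetworkManager is not restarted, matching the log

The "Enable and start NetworkManager" task that follows does run, since its logged condition (network_provider == "nm" or network_state != {}) evaluates to True.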
13830 1727204131.49630: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13830 1727204131.49935: in run() - task 0affcd87-79f5-1659-6b02-000000000e16 13830 1727204131.50398: variable 'ansible_search_path' from source: unknown 13830 1727204131.50405: variable 'ansible_search_path' from source: unknown 13830 1727204131.50453: calling self._execute() 13830 1727204131.50570: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204131.50585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204131.50603: variable 'omit' from source: magic vars 13830 1727204131.51006: variable 'ansible_distribution_major_version' from source: facts 13830 1727204131.51024: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204131.51197: variable 'network_provider' from source: set_fact 13830 1727204131.51206: variable 'network_state' from source: role '' defaults 13830 1727204131.51220: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13830 1727204131.51230: variable 'omit' from source: magic vars 13830 1727204131.51311: variable 'omit' from source: magic vars 13830 1727204131.51347: variable 'network_service_name' from source: role '' defaults 13830 1727204131.51423: variable 'network_service_name' from source: role '' defaults 13830 1727204131.51542: variable '__network_provider_setup' from source: role '' defaults 13830 1727204131.51554: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204131.51624: variable '__network_service_name_default_nm' from source: role '' defaults 13830 1727204131.51640: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204131.51704: variable '__network_packages_default_nm' from source: role '' defaults 13830 1727204131.51939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204131.54357: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204131.54447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204131.54494: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204131.54533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204131.54575: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204131.54660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204131.54696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204131.54726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204131.54781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 13830 1727204131.54799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204131.54849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204131.54881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204131.54909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204131.54956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204131.54982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204131.55228: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13830 1727204131.55358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204131.55389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204131.55421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204131.55469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204131.55487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204131.55589: variable 'ansible_python' from source: facts 13830 1727204131.55613: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13830 1727204131.55707: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204131.55797: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204131.55936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204131.55973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204131.56001: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204131.56048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204131.56074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204131.56125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204131.56168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204131.56198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204131.56243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204131.56260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204131.56418: variable 'network_connections' from source: task vars 13830 1727204131.56430: variable 'port2_profile' from source: play vars 13830 1727204131.56515: variable 'port2_profile' from source: play vars 13830 1727204131.56536: variable 'port1_profile' from source: play vars 13830 1727204131.56616: variable 'port1_profile' from source: play vars 13830 1727204131.56639: variable 'controller_profile' from source: play vars 13830 1727204131.56719: variable 'controller_profile' from source: play vars 13830 1727204131.56845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204131.57061: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204131.57112: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204131.57162: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204131.57214: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204131.57289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204131.57318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204131.57353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 13830 1727204131.57399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204131.57455: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204131.57760: variable 'network_connections' from source: task vars 13830 1727204131.57776: variable 'port2_profile' from source: play vars 13830 1727204131.57861: variable 'port2_profile' from source: play vars 13830 1727204131.57882: variable 'port1_profile' from source: play vars 13830 1727204131.57970: variable 'port1_profile' from source: play vars 13830 1727204131.58030: variable 'controller_profile' from source: play vars 13830 1727204131.58110: variable 'controller_profile' from source: play vars 13830 1727204131.58274: variable '__network_packages_default_wireless' from source: role '' defaults 13830 1727204131.58477: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204131.59149: variable 'network_connections' from source: task vars 13830 1727204131.59224: variable 'port2_profile' from source: play vars 13830 1727204131.59354: variable 'port2_profile' from source: play vars 13830 1727204131.59446: variable 'port1_profile' from source: play vars 13830 1727204131.59522: variable 'port1_profile' from source: play vars 13830 1727204131.59661: variable 'controller_profile' from source: play vars 13830 1727204131.59738: variable 'controller_profile' from source: play vars 13830 1727204131.59886: variable '__network_packages_default_team' from source: role '' defaults 13830 1727204131.59972: variable '__network_team_connections_defined' from source: role '' defaults 13830 1727204131.60659: variable 'network_connections' from source: task vars 13830 1727204131.60672: variable 'port2_profile' from source: play vars 13830 1727204131.60817: variable 'port2_profile' from source: play vars 13830 1727204131.60862: variable 'port1_profile' from source: play vars 13830 1727204131.60940: variable 'port1_profile' from source: play vars 13830 1727204131.61081: variable 'controller_profile' from source: play vars 13830 1727204131.61155: variable 'controller_profile' from source: play vars 13830 1727204131.61349: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204131.61492: variable '__network_service_name_default_initscripts' from source: role '' defaults 13830 1727204131.61509: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204131.61578: variable '__network_packages_default_initscripts' from source: role '' defaults 13830 1727204131.62115: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13830 1727204131.63309: variable 'network_connections' from source: task vars 13830 1727204131.63320: variable 'port2_profile' from source: play vars 13830 1727204131.63504: variable 'port2_profile' from source: play vars 13830 1727204131.63517: variable 'port1_profile' from source: play vars 13830 1727204131.63695: variable 'port1_profile' from source: play vars 13830 1727204131.63707: variable 'controller_profile' from source: play vars 13830 1727204131.63769: variable 'controller_profile' from source: play vars 13830 1727204131.63781: variable 'ansible_distribution' from source: facts 13830 1727204131.63792: variable '__network_rh_distros' from source: role '' defaults 13830 1727204131.63803: variable 
'ansible_distribution_major_version' from source: facts 13830 1727204131.63822: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13830 1727204131.64160: variable 'ansible_distribution' from source: facts 13830 1727204131.64228: variable '__network_rh_distros' from source: role '' defaults 13830 1727204131.64240: variable 'ansible_distribution_major_version' from source: facts 13830 1727204131.64283: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13830 1727204131.64619: variable 'ansible_distribution' from source: facts 13830 1727204131.64772: variable '__network_rh_distros' from source: role '' defaults 13830 1727204131.64784: variable 'ansible_distribution_major_version' from source: facts 13830 1727204131.64826: variable 'network_provider' from source: set_fact 13830 1727204131.64857: variable 'omit' from source: magic vars 13830 1727204131.64893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204131.65011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204131.65039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204131.65061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204131.65079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204131.65228: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204131.65241: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204131.65249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204131.65360: Set connection var ansible_connection to ssh 13830 1727204131.65427: Set connection var ansible_timeout to 10 13830 1727204131.65532: Set connection var ansible_shell_executable to /bin/sh 13830 1727204131.65542: Set connection var ansible_shell_type to sh 13830 1727204131.65552: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204131.65566: Set connection var ansible_pipelining to False 13830 1727204131.65595: variable 'ansible_shell_executable' from source: unknown 13830 1727204131.65602: variable 'ansible_connection' from source: unknown 13830 1727204131.65608: variable 'ansible_module_compression' from source: unknown 13830 1727204131.65615: variable 'ansible_shell_type' from source: unknown 13830 1727204131.65620: variable 'ansible_shell_executable' from source: unknown 13830 1727204131.65627: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204131.65640: variable 'ansible_pipelining' from source: unknown 13830 1727204131.65646: variable 'ansible_timeout' from source: unknown 13830 1727204131.65749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204131.65981: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204131.66000: variable 'omit' from source: magic vars 13830 1727204131.66012: starting attempt loop 13830 1727204131.66020: running the 
handler 13830 1727204131.66112: variable 'ansible_facts' from source: unknown 13830 1727204131.67852: _low_level_execute_command(): starting 13830 1727204131.67867: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204131.68765: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204131.68786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.68801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.68821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.68869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.68882: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204131.68906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.68926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204131.68947: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204131.68960: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204131.68981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.68996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.69022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.69039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.69052: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204131.69075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.69183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204131.69212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204131.69233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204131.69312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204131.70931: stdout chunk (state=3): >>>/root <<< 13830 1727204131.71032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204131.71124: stderr chunk (state=3): >>><<< 13830 1727204131.71140: stdout chunk (state=3): >>><<< 13830 1727204131.71263: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204131.71269: _low_level_execute_command(): starting 13830 1727204131.71272: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888 `" && echo ansible-tmp-1727204131.711722-18554-40983051847888="` echo /root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888 `" ) && sleep 0' 13830 1727204131.72355: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204131.72383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.72399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.72417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.72608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.72624: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204131.72641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.72665: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204131.72679: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204131.72689: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204131.72700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.72731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.72750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.72785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.72796: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204131.72809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.72909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204131.72930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204131.72950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204131.73027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204131.75120: stdout chunk (state=3): >>>ansible-tmp-1727204131.711722-18554-40983051847888=/root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888 <<< 13830 1727204131.75275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204131.75336: stderr chunk (state=3): >>><<< 13830 1727204131.75340: stdout chunk (state=3): >>><<< 13830 1727204131.75356: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204131.711722-18554-40983051847888=/root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204131.75394: variable 'ansible_module_compression' from source: unknown 13830 1727204131.75450: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13830 1727204131.75511: variable 'ansible_facts' from source: unknown 13830 1727204131.75702: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888/AnsiballZ_systemd.py 13830 1727204131.75856: Sending initial data 13830 1727204131.75859: Sent initial data (154 bytes) 13830 1727204131.77750: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204131.77763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.77779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.77800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.77848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.77859: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204131.77874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.77889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204131.77899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204131.77917: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204131.77928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.77942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.77956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.77968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.77980: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204131.77994: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.78074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204131.78093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204131.78109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204131.78203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204131.79876: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204131.79923: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204131.79970: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp1ttvilio /root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888/AnsiballZ_systemd.py <<< 13830 1727204131.80008: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204131.82399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204131.82535: stderr chunk (state=3): >>><<< 13830 1727204131.82539: stdout chunk (state=3): >>><<< 13830 1727204131.82542: done transferring module to remote 13830 1727204131.82544: _low_level_execute_command(): starting 13830 1727204131.82546: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888/ /root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888/AnsiballZ_systemd.py && sleep 0' 13830 1727204131.83222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204131.83238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.83254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.83276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.83322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.83336: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204131.83352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.83374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204131.83387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204131.83399: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204131.83411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.83425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.83441: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.83455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.83469: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204131.83484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.83576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204131.83594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204131.83609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204131.83693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204131.85443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204131.85447: stdout chunk (state=3): >>><<< 13830 1727204131.85455: stderr chunk (state=3): >>><<< 13830 1727204131.85472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204131.85475: _low_level_execute_command(): starting 13830 1727204131.85481: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888/AnsiballZ_systemd.py && sleep 0' 13830 1727204131.86141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204131.86150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.86161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.86180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.86218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.86225: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204131.86235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.86252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204131.86260: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204131.86267: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 13830 1727204131.86278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204131.86287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204131.86298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204131.86306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204131.86311: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204131.86321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204131.86394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204131.86413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204131.86424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204131.86506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204132.11752: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 13830 1727204132.11777: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "16019456", "MemoryAvailable": "infinity", "CPUUsageNSec": "1160346000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", 
"IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": 
"no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13830 1727204132.13605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204132.13610: stdout chunk (state=3): >>><<< 13830 1727204132.13613: stderr chunk (state=3): >>><<< 13830 1727204132.13617: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16019456", "MemoryAvailable": "infinity", "CPUUsageNSec": "1160346000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
13830 1727204132.13626: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204132.13630: _low_level_execute_command(): starting 13830 1727204132.13632: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204131.711722-18554-40983051847888/ > /dev/null 2>&1 && sleep 0' 13830 1727204132.14221: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204132.14238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.14254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.14275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.14322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204132.14335: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204132.14350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.14373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204132.14387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204132.14398: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204132.14412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.14429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.14446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.14460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204132.14475: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204132.14490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.14574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204132.14594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204132.14609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204132.14690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204132.16508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204132.16511: stdout chunk (state=3): >>><<< 13830 1727204132.16514: stderr chunk (state=3): >>><<< 13830 1727204132.16517: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204132.16527: handler run complete 13830 1727204132.16589: attempt loop complete, returning result 13830 1727204132.16592: _execute() done 13830 1727204132.16594: dumping result to json 13830 1727204132.16610: done dumping result, returning 13830 1727204132.16619: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-1659-6b02-000000000e16] 13830 1727204132.16626: sending task result for task 0affcd87-79f5-1659-6b02-000000000e16 13830 1727204132.16921: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e16 13830 1727204132.16924: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204132.16982: no more pending results, returning what we have 13830 1727204132.16986: results queue empty 13830 1727204132.16987: checking for any_errors_fatal 13830 1727204132.16993: done checking for any_errors_fatal 13830 1727204132.16994: checking for max_fail_percentage 13830 1727204132.16995: done checking for max_fail_percentage 13830 1727204132.16996: checking to see if all hosts have failed and the running result is not ok 13830 1727204132.16997: done checking to see if all hosts have failed 13830 1727204132.16998: getting the remaining hosts for this loop 13830 1727204132.16999: done getting the remaining hosts for this loop 13830 1727204132.17003: getting the next task for host managed-node3 13830 1727204132.17009: done getting next task for host managed-node3 13830 1727204132.17013: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13830 1727204132.17018: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204132.17029: getting variables 13830 1727204132.17031: in VariableManager get_vars() 13830 1727204132.17077: Calling all_inventory to load vars for managed-node3 13830 1727204132.17079: Calling groups_inventory to load vars for managed-node3 13830 1727204132.17082: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204132.17090: Calling all_plugins_play to load vars for managed-node3 13830 1727204132.17093: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204132.17095: Calling groups_plugins_play to load vars for managed-node3 13830 1727204132.19805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204132.23718: done with get_vars() 13830 1727204132.23828: done getting variables 13830 1727204132.23901: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.759) 0:01:05.317 ***** 13830 1727204132.23938: entering _queue_task() for managed-node3/service 13830 1727204132.24397: worker is 1 (out of 1 available) 13830 1727204132.24427: exiting _queue_task() for managed-node3/service 13830 1727204132.24440: done queuing things up, now waiting for results queue to drain 13830 1727204132.24442: waiting for pending results... 
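
Note on the SSH traffic above: every remote step of this task (running AnsiballZ_systemd.py, then removing the temporary directory) reuses one persistent connection; "auto-mux: Trying existing master" and "mux_client_request_session: master session id: 2" mean only a lightweight multiplexed session is opened per command. Below is a rough sketch of the same idea using plain OpenSSH options; the ControlPath value is a placeholder, not the socket path Ansible derives itself.

    import subprocess

    # Placeholder control-socket path; Ansible computes its own under ~/.ansible/cp/.
    MUX_OPTS = [
        "-o", "ControlMaster=auto",
        "-o", "ControlPersist=60s",
        "-o", "ControlPath=~/.ssh/mux-%r@%h:%p",
    ]

    def run_remote(host, command):
        # With the master already established, each call here is just another
        # mux client session, so no new TCP or authentication handshake happens.
        return subprocess.run(["ssh", *MUX_OPTS, host, command],
                              capture_output=True, text=True)

    result = run_remote("root@10.31.15.87", "/bin/sh -c 'echo ~ && sleep 0'")
    print(result.returncode, result.stdout.strip())
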
13830 1727204132.24777: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13830 1727204132.24954: in run() - task 0affcd87-79f5-1659-6b02-000000000e17 13830 1727204132.24982: variable 'ansible_search_path' from source: unknown 13830 1727204132.24992: variable 'ansible_search_path' from source: unknown 13830 1727204132.25042: calling self._execute() 13830 1727204132.25154: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204132.25171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204132.25189: variable 'omit' from source: magic vars 13830 1727204132.25631: variable 'ansible_distribution_major_version' from source: facts 13830 1727204132.25650: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204132.25779: variable 'network_provider' from source: set_fact 13830 1727204132.25790: Evaluated conditional (network_provider == "nm"): True 13830 1727204132.25891: variable '__network_wpa_supplicant_required' from source: role '' defaults 13830 1727204132.25987: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13830 1727204132.26167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204132.29160: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204132.29245: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204132.29294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204132.29345: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204132.29378: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204132.29472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204132.29571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204132.29602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204132.29693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204132.29798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204132.29982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204132.30012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 13830 1727204132.30110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204132.30169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204132.30194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204132.30243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204132.30274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204132.30314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204132.30359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204132.30379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204132.30580: variable 'network_connections' from source: task vars 13830 1727204132.30598: variable 'port2_profile' from source: play vars 13830 1727204132.30752: variable 'port2_profile' from source: play vars 13830 1727204132.30771: variable 'port1_profile' from source: play vars 13830 1727204132.30862: variable 'port1_profile' from source: play vars 13830 1727204132.30882: variable 'controller_profile' from source: play vars 13830 1727204132.30937: variable 'controller_profile' from source: play vars 13830 1727204132.30998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13830 1727204132.31138: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13830 1727204132.31172: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13830 1727204132.31197: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13830 1727204132.31222: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13830 1727204132.31254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13830 1727204132.31271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13830 1727204132.31290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204132.31307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13830 1727204132.31348: variable '__network_wireless_connections_defined' from source: role '' defaults 13830 1727204132.31507: variable 'network_connections' from source: task vars 13830 1727204132.31511: variable 'port2_profile' from source: play vars 13830 1727204132.31553: variable 'port2_profile' from source: play vars 13830 1727204132.31560: variable 'port1_profile' from source: play vars 13830 1727204132.31604: variable 'port1_profile' from source: play vars 13830 1727204132.31619: variable 'controller_profile' from source: play vars 13830 1727204132.31659: variable 'controller_profile' from source: play vars 13830 1727204132.31682: Evaluated conditional (__network_wpa_supplicant_required): False 13830 1727204132.31685: when evaluation is False, skipping this task 13830 1727204132.31688: _execute() done 13830 1727204132.31690: dumping result to json 13830 1727204132.31692: done dumping result, returning 13830 1727204132.31700: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-1659-6b02-000000000e17] 13830 1727204132.31705: sending task result for task 0affcd87-79f5-1659-6b02-000000000e17 13830 1727204132.31804: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e17 13830 1727204132.31807: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13830 1727204132.31883: no more pending results, returning what we have 13830 1727204132.31888: results queue empty 13830 1727204132.31889: checking for any_errors_fatal 13830 1727204132.31909: done checking for any_errors_fatal 13830 1727204132.31910: checking for max_fail_percentage 13830 1727204132.31912: done checking for max_fail_percentage 13830 1727204132.31913: checking to see if all hosts have failed and the running result is not ok 13830 1727204132.31914: done checking to see if all hosts have failed 13830 1727204132.31915: getting the remaining hosts for this loop 13830 1727204132.31918: done getting the remaining hosts for this loop 13830 1727204132.32077: getting the next task for host managed-node3 13830 1727204132.32086: done getting next task for host managed-node3 13830 1727204132.32091: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13830 1727204132.32096: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204132.32116: getting variables 13830 1727204132.32117: in VariableManager get_vars() 13830 1727204132.32160: Calling all_inventory to load vars for managed-node3 13830 1727204132.32162: Calling groups_inventory to load vars for managed-node3 13830 1727204132.32167: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204132.32176: Calling all_plugins_play to load vars for managed-node3 13830 1727204132.32179: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204132.32182: Calling groups_plugins_play to load vars for managed-node3 13830 1727204132.33859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204132.36155: done with get_vars() 13830 1727204132.36194: done getting variables 13830 1727204132.36256: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.123) 0:01:05.441 ***** 13830 1727204132.36299: entering _queue_task() for managed-node3/service 13830 1727204132.36694: worker is 1 (out of 1 available) 13830 1727204132.36707: exiting _queue_task() for managed-node3/service 13830 1727204132.36725: done queuing things up, now waiting for results queue to drain 13830 1727204132.36726: waiting for pending results... 
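
Note on the skip above: the task outcomes in this block all come down to `when:` evaluation. `ansible_distribution_major_version != '6'` and `network_provider == "nm"` evaluated True, while `__network_wpa_supplicant_required` evaluated False, so "Enable and start wpa_supplicant" was skipped; "Enable network service" below is gated the same way on `network_provider == "initscripts"`. A much-simplified sketch of reducing such an expression to a boolean with Jinja2 follows; the variable values are placeholders (the actual major version is not shown in this excerpt), and Ansible's real conditional handling does considerably more.

    from jinja2 import Environment

    env = Environment()

    def evaluate_when(expression, variables):
        # compile_expression turns "a != 'b'" into a callable that is evaluated
        # against the task's variables and reduced to True/False.
        return bool(env.compile_expression(expression)(**variables))

    # Placeholder values consistent with the evaluations logged above.
    task_vars = {
        "ansible_distribution_major_version": "9",
        "network_provider": "nm",
        "__network_wpa_supplicant_required": False,
    }
    print(evaluate_when("ansible_distribution_major_version != '6'", task_vars))  # True
    print(evaluate_when("network_provider == 'nm'", task_vars))                   # True
    print(evaluate_when("__network_wpa_supplicant_required", task_vars))          # False -> skipped
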
13830 1727204132.37139: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 13830 1727204132.37303: in run() - task 0affcd87-79f5-1659-6b02-000000000e18 13830 1727204132.37316: variable 'ansible_search_path' from source: unknown 13830 1727204132.37320: variable 'ansible_search_path' from source: unknown 13830 1727204132.37355: calling self._execute() 13830 1727204132.37456: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204132.37461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204132.37473: variable 'omit' from source: magic vars 13830 1727204132.37881: variable 'ansible_distribution_major_version' from source: facts 13830 1727204132.37894: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204132.38018: variable 'network_provider' from source: set_fact 13830 1727204132.38022: Evaluated conditional (network_provider == "initscripts"): False 13830 1727204132.38024: when evaluation is False, skipping this task 13830 1727204132.38036: _execute() done 13830 1727204132.38040: dumping result to json 13830 1727204132.38043: done dumping result, returning 13830 1727204132.38056: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-1659-6b02-000000000e18] 13830 1727204132.38062: sending task result for task 0affcd87-79f5-1659-6b02-000000000e18 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13830 1727204132.38211: no more pending results, returning what we have 13830 1727204132.38216: results queue empty 13830 1727204132.38217: checking for any_errors_fatal 13830 1727204132.38227: done checking for any_errors_fatal 13830 1727204132.38228: checking for max_fail_percentage 13830 1727204132.38230: done checking for max_fail_percentage 13830 1727204132.38231: checking to see if all hosts have failed and the running result is not ok 13830 1727204132.38232: done checking to see if all hosts have failed 13830 1727204132.38233: getting the remaining hosts for this loop 13830 1727204132.38234: done getting the remaining hosts for this loop 13830 1727204132.38239: getting the next task for host managed-node3 13830 1727204132.38249: done getting next task for host managed-node3 13830 1727204132.38253: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13830 1727204132.38259: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204132.38278: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e18 13830 1727204132.38281: WORKER PROCESS EXITING 13830 1727204132.38301: getting variables 13830 1727204132.38303: in VariableManager get_vars() 13830 1727204132.38355: Calling all_inventory to load vars for managed-node3 13830 1727204132.38358: Calling groups_inventory to load vars for managed-node3 13830 1727204132.38360: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204132.38374: Calling all_plugins_play to load vars for managed-node3 13830 1727204132.38377: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204132.38380: Calling groups_plugins_play to load vars for managed-node3 13830 1727204132.40253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204132.42444: done with get_vars() 13830 1727204132.42474: done getting variables 13830 1727204132.42548: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.062) 0:01:05.503 ***** 13830 1727204132.42589: entering _queue_task() for managed-node3/copy 13830 1727204132.43968: worker is 1 (out of 1 available) 13830 1727204132.43978: exiting _queue_task() for managed-node3/copy 13830 1727204132.43988: done queuing things up, now waiting for results queue to drain 13830 1727204132.43990: waiting for pending results... 
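
Note on the "^ state is: HOST STATE: ..." dumps: they record where the strategy currently sits in the role's block structure (block/task indices, rescue/always counters, and nested child states). When reading long runs of these, pulling out the numeric fields can help; the helper below is only a triage aid written for this log format, not anything Ansible provides.

    import re

    FIELD_RE = re.compile(r"(\w+)=(\d+)")

    def state_fields(dump):
        # Collect every "name=<integer>" pair from a "HOST STATE: ..." dump;
        # the nested "child state?" fragments are left untouched.
        return dict(FIELD_RE.findall(dump))

    example = ("HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, "
               "run_state=1, fail_state=0, pre_flushing_run_state=1")
    print(state_fields(example))
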
13830 1727204132.44011: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13830 1727204132.44120: in run() - task 0affcd87-79f5-1659-6b02-000000000e19 13830 1727204132.44142: variable 'ansible_search_path' from source: unknown 13830 1727204132.44150: variable 'ansible_search_path' from source: unknown 13830 1727204132.44195: calling self._execute() 13830 1727204132.44307: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204132.44320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204132.44337: variable 'omit' from source: magic vars 13830 1727204132.44737: variable 'ansible_distribution_major_version' from source: facts 13830 1727204132.44756: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204132.44884: variable 'network_provider' from source: set_fact 13830 1727204132.44896: Evaluated conditional (network_provider == "initscripts"): False 13830 1727204132.44904: when evaluation is False, skipping this task 13830 1727204132.44910: _execute() done 13830 1727204132.44917: dumping result to json 13830 1727204132.44924: done dumping result, returning 13830 1727204132.44942: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-1659-6b02-000000000e19] 13830 1727204132.44953: sending task result for task 0affcd87-79f5-1659-6b02-000000000e19 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13830 1727204132.45125: no more pending results, returning what we have 13830 1727204132.45129: results queue empty 13830 1727204132.45130: checking for any_errors_fatal 13830 1727204132.45136: done checking for any_errors_fatal 13830 1727204132.45137: checking for max_fail_percentage 13830 1727204132.45139: done checking for max_fail_percentage 13830 1727204132.45140: checking to see if all hosts have failed and the running result is not ok 13830 1727204132.45140: done checking to see if all hosts have failed 13830 1727204132.45141: getting the remaining hosts for this loop 13830 1727204132.45143: done getting the remaining hosts for this loop 13830 1727204132.45146: getting the next task for host managed-node3 13830 1727204132.45155: done getting next task for host managed-node3 13830 1727204132.45158: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13830 1727204132.45165: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204132.45183: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e19 13830 1727204132.45186: WORKER PROCESS EXITING 13830 1727204132.45281: getting variables 13830 1727204132.45283: in VariableManager get_vars() 13830 1727204132.45327: Calling all_inventory to load vars for managed-node3 13830 1727204132.45329: Calling groups_inventory to load vars for managed-node3 13830 1727204132.45331: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204132.45341: Calling all_plugins_play to load vars for managed-node3 13830 1727204132.45343: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204132.45346: Calling groups_plugins_play to load vars for managed-node3 13830 1727204132.47029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204132.49218: done with get_vars() 13830 1727204132.49256: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.067) 0:01:05.571 ***** 13830 1727204132.49391: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 13830 1727204132.49793: worker is 1 (out of 1 available) 13830 1727204132.49807: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 13830 1727204132.49819: done queuing things up, now waiting for results queue to drain 13830 1727204132.49821: waiting for pending results... 
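
Note on the TASK banners: each one carries two timers, e.g. "(0:00:00.062) 0:01:05.503". The parenthesised value is the duration of the task that just finished, and the second value is the cumulative playbook runtime, so the totals advance by the per-task durations (off by a millisecond here and there because each displayed number is already rounded). A quick check against the banners above:

    from datetime import timedelta

    # Cumulative runtime shown in the "Enable and start wpa_supplicant" banner.
    cumulative = timedelta(minutes=1, seconds=5.317)

    # Per-task durations printed in the three banners that follow it.
    for task_seconds in (0.123, 0.062, 0.067):
        cumulative += timedelta(seconds=task_seconds)
        print(cumulative)
    # prints 0:01:05.440000, 0:01:05.502000, 0:01:05.569000; the log shows
    # .441 / .503 / .571 because each displayed duration was itself rounded.
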
13830 1727204132.50132: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13830 1727204132.50297: in run() - task 0affcd87-79f5-1659-6b02-000000000e1a 13830 1727204132.50313: variable 'ansible_search_path' from source: unknown 13830 1727204132.50317: variable 'ansible_search_path' from source: unknown 13830 1727204132.50354: calling self._execute() 13830 1727204132.50454: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204132.50459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204132.50470: variable 'omit' from source: magic vars 13830 1727204132.50853: variable 'ansible_distribution_major_version' from source: facts 13830 1727204132.50866: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204132.50872: variable 'omit' from source: magic vars 13830 1727204132.50946: variable 'omit' from source: magic vars 13830 1727204132.51111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13830 1727204132.53424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13830 1727204132.53491: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13830 1727204132.53531: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13830 1727204132.53568: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13830 1727204132.53594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13830 1727204132.53683: variable 'network_provider' from source: set_fact 13830 1727204132.53814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13830 1727204132.53844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13830 1727204132.53876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13830 1727204132.53915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13830 1727204132.53929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13830 1727204132.54010: variable 'omit' from source: magic vars 13830 1727204132.54123: variable 'omit' from source: magic vars 13830 1727204132.54229: variable 'network_connections' from source: task vars 13830 1727204132.54241: variable 'port2_profile' from source: play vars 13830 1727204132.54305: variable 'port2_profile' from source: play vars 13830 1727204132.54313: variable 'port1_profile' from source: play vars 13830 1727204132.54374: variable 'port1_profile' from source: play vars 13830 1727204132.54387: variable 'controller_profile' from source: 
play vars 13830 1727204132.54448: variable 'controller_profile' from source: play vars 13830 1727204132.54623: variable 'omit' from source: magic vars 13830 1727204132.54631: variable '__lsr_ansible_managed' from source: task vars 13830 1727204132.54691: variable '__lsr_ansible_managed' from source: task vars 13830 1727204132.54889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13830 1727204132.55480: Loaded config def from plugin (lookup/template) 13830 1727204132.55483: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13830 1727204132.55513: File lookup term: get_ansible_managed.j2 13830 1727204132.55517: variable 'ansible_search_path' from source: unknown 13830 1727204132.55520: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13830 1727204132.55537: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13830 1727204132.55550: variable 'ansible_search_path' from source: unknown 13830 1727204132.61786: variable 'ansible_managed' from source: unknown 13830 1727204132.61944: variable 'omit' from source: magic vars 13830 1727204132.61982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204132.62006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204132.62024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204132.62041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204132.62051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204132.62107: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204132.62111: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204132.62113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204132.62213: Set connection var ansible_connection to ssh 13830 1727204132.62220: Set connection var ansible_timeout to 10 13830 1727204132.62227: Set connection var ansible_shell_executable to /bin/sh 13830 1727204132.62229: Set connection var ansible_shell_type to sh 13830 1727204132.62237: Set connection var ansible_module_compression to 
ZIP_DEFLATED 13830 1727204132.62244: Set connection var ansible_pipelining to False 13830 1727204132.62272: variable 'ansible_shell_executable' from source: unknown 13830 1727204132.62275: variable 'ansible_connection' from source: unknown 13830 1727204132.62283: variable 'ansible_module_compression' from source: unknown 13830 1727204132.62286: variable 'ansible_shell_type' from source: unknown 13830 1727204132.62288: variable 'ansible_shell_executable' from source: unknown 13830 1727204132.62292: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204132.62296: variable 'ansible_pipelining' from source: unknown 13830 1727204132.62300: variable 'ansible_timeout' from source: unknown 13830 1727204132.62310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204132.62442: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204132.62452: variable 'omit' from source: magic vars 13830 1727204132.62459: starting attempt loop 13830 1727204132.62462: running the handler 13830 1727204132.62478: _low_level_execute_command(): starting 13830 1727204132.62484: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204132.63585: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204132.63588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.63590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.63592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.63594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204132.63599: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204132.63601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.63603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204132.63605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204132.63607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204132.63609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.63611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.63613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.63615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204132.63617: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204132.63618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.63620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204132.63621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204132.63623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204132.63624: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13830 1727204132.65236: stdout chunk (state=3): >>>/root <<< 13830 1727204132.65331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204132.65423: stderr chunk (state=3): >>><<< 13830 1727204132.65429: stdout chunk (state=3): >>><<< 13830 1727204132.65459: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204132.65479: _low_level_execute_command(): starting 13830 1727204132.65486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736 `" && echo ansible-tmp-1727204132.6545897-18592-240732136739736="` echo /root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736 `" ) && sleep 0' 13830 1727204132.66148: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204132.66157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.66170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.66184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.66222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204132.66229: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204132.66247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.66260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204132.66273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204132.66281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204132.66288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.66297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.66310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.66315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
<<< 13830 1727204132.66322: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204132.66331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.66408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204132.66428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204132.66440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204132.66519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204132.68356: stdout chunk (state=3): >>>ansible-tmp-1727204132.6545897-18592-240732136739736=/root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736 <<< 13830 1727204132.68541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204132.68545: stdout chunk (state=3): >>><<< 13830 1727204132.68552: stderr chunk (state=3): >>><<< 13830 1727204132.68575: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204132.6545897-18592-240732136739736=/root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204132.68622: variable 'ansible_module_compression' from source: unknown 13830 1727204132.68669: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13830 1727204132.68705: variable 'ansible_facts' from source: unknown 13830 1727204132.68797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736/AnsiballZ_network_connections.py 13830 1727204132.68940: Sending initial data 13830 1727204132.68946: Sent initial data (168 bytes) 13830 1727204132.69889: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204132.69897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.69908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.69922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.69959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 
1727204132.69967: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204132.69982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.69991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204132.69999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204132.70005: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204132.70013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.70023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.70037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.70040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204132.70048: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204132.70057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.70132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204132.70147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204132.70154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204132.70339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204132.71913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204132.71951: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204132.71988: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp_qnnhkpe /root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736/AnsiballZ_network_connections.py <<< 13830 1727204132.72181: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204132.73837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204132.73930: stderr chunk (state=3): >>><<< 13830 1727204132.73934: stdout chunk (state=3): >>><<< 13830 1727204132.73962: done transferring module to remote 13830 1727204132.73976: _low_level_execute_command(): starting 13830 1727204132.73983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736/ /root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736/AnsiballZ_network_connections.py && sleep 0' 13830 1727204132.74698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204132.74709: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13830 1727204132.74728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.74745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.74786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204132.74794: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204132.74803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.74817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204132.74831: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204132.74842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204132.74852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204132.74868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.74881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.74889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204132.74896: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204132.74906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.74990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204132.75004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204132.75014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204132.75087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204132.76868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204132.76872: stdout chunk (state=3): >>><<< 13830 1727204132.76875: stderr chunk (state=3): >>><<< 13830 1727204132.76972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204132.76984: _low_level_execute_command(): starting 13830 1727204132.76987: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736/AnsiballZ_network_connections.py && sleep 0' 13830 1727204132.78271: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204132.78275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204132.78403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204132.78407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13830 1727204132.78420: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204132.78501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204132.78504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204132.78686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204132.78801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204133.30912: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/12faa5fe-601d-4179-9d4d-c366327061e9: error=unknown <<< 13830 1727204133.32815: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/e8dfd867-3423-41f0-b2ba-561a2a4c7934: error=unknown <<< 13830 1727204133.34840: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/aaf1f626-5889-4bd6-9eb8-491a8b173119: error=unknown <<< 13830 1727204133.35020: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13830 1727204133.36701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204133.36706: stdout chunk (state=3): >>><<< 13830 1727204133.36711: stderr chunk (state=3): >>><<< 13830 1727204133.36736: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/12faa5fe-601d-4179-9d4d-c366327061e9: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/e8dfd867-3423-41f0-b2ba-561a2a4c7934: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_1_j9hw18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/aaf1f626-5889-4bd6-9eb8-491a8b173119: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204133.36785: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204133.36794: _low_level_execute_command(): starting 13830 1727204133.36799: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204132.6545897-18592-240732136739736/ > /dev/null 2>&1 && sleep 0' 13830 1727204133.37468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204133.37477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.37487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.37502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.37542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.37549: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204133.37559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.37580: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204133.37587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204133.37593: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204133.37602: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.37609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.37622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.37629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.37638: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204133.37644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.37796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204133.37799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204133.37802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204133.38454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204133.40086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204133.40097: stderr chunk (state=3): >>><<< 13830 1727204133.40101: stdout chunk (state=3): >>><<< 13830 1727204133.40120: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204133.40126: handler run complete 13830 1727204133.40159: attempt loop complete, returning result 13830 1727204133.40163: _execute() done 13830 1727204133.40169: dumping result to json 13830 1727204133.40171: done dumping result, returning 13830 1727204133.40180: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-1659-6b02-000000000e1a] 13830 1727204133.40186: sending task result for task 0affcd87-79f5-1659-6b02-000000000e1a 13830 1727204133.40310: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e1a 13830 1727204133.40313: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": 
false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13830 1727204133.40432: no more pending results, returning what we have 13830 1727204133.40439: results queue empty 13830 1727204133.40439: checking for any_errors_fatal 13830 1727204133.40446: done checking for any_errors_fatal 13830 1727204133.40447: checking for max_fail_percentage 13830 1727204133.40449: done checking for max_fail_percentage 13830 1727204133.40450: checking to see if all hosts have failed and the running result is not ok 13830 1727204133.40450: done checking to see if all hosts have failed 13830 1727204133.40451: getting the remaining hosts for this loop 13830 1727204133.40453: done getting the remaining hosts for this loop 13830 1727204133.40456: getting the next task for host managed-node3 13830 1727204133.40465: done getting next task for host managed-node3 13830 1727204133.40469: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13830 1727204133.40474: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204133.40491: getting variables 13830 1727204133.40492: in VariableManager get_vars() 13830 1727204133.40536: Calling all_inventory to load vars for managed-node3 13830 1727204133.40539: Calling groups_inventory to load vars for managed-node3 13830 1727204133.40541: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204133.40551: Calling all_plugins_play to load vars for managed-node3 13830 1727204133.40553: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204133.40555: Calling groups_plugins_play to load vars for managed-node3 13830 1727204133.43071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204133.46825: done with get_vars() 13830 1727204133.46867: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.975) 0:01:06.547 ***** 13830 1727204133.46971: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 13830 1727204133.47324: worker is 1 (out of 1 available) 13830 1727204133.47339: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 13830 1727204133.47351: done queuing things up, now waiting for results queue to drain 13830 1727204133.47352: waiting for pending results... 13830 1727204133.47682: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 13830 1727204133.47862: in run() - task 0affcd87-79f5-1659-6b02-000000000e1b 13830 1727204133.47888: variable 'ansible_search_path' from source: unknown 13830 1727204133.47911: variable 'ansible_search_path' from source: unknown 13830 1727204133.47959: calling self._execute() 13830 1727204133.48107: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.48119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.48141: variable 'omit' from source: magic vars 13830 1727204133.48638: variable 'ansible_distribution_major_version' from source: facts 13830 1727204133.48658: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204133.48787: variable 'network_state' from source: role '' defaults 13830 1727204133.48797: Evaluated conditional (network_state != {}): False 13830 1727204133.48800: when evaluation is False, skipping this task 13830 1727204133.48803: _execute() done 13830 1727204133.48805: dumping result to json 13830 1727204133.48810: done dumping result, returning 13830 1727204133.48816: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-1659-6b02-000000000e1b] 13830 1727204133.48825: sending task result for task 0affcd87-79f5-1659-6b02-000000000e1b 13830 1727204133.48926: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e1b 13830 1727204133.48928: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13830 1727204133.48984: no more pending results, returning what we have 13830 1727204133.48987: results queue empty 13830 1727204133.48988: checking for any_errors_fatal 13830 1727204133.49005: done checking for any_errors_fatal 13830 1727204133.49005: checking for max_fail_percentage 13830 
1727204133.49007: done checking for max_fail_percentage 13830 1727204133.49008: checking to see if all hosts have failed and the running result is not ok 13830 1727204133.49009: done checking to see if all hosts have failed 13830 1727204133.49010: getting the remaining hosts for this loop 13830 1727204133.49011: done getting the remaining hosts for this loop 13830 1727204133.49015: getting the next task for host managed-node3 13830 1727204133.49025: done getting next task for host managed-node3 13830 1727204133.49029: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13830 1727204133.49037: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204133.49060: getting variables 13830 1727204133.49062: in VariableManager get_vars() 13830 1727204133.49103: Calling all_inventory to load vars for managed-node3 13830 1727204133.49106: Calling groups_inventory to load vars for managed-node3 13830 1727204133.49108: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204133.49118: Calling all_plugins_play to load vars for managed-node3 13830 1727204133.49120: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204133.49123: Calling groups_plugins_play to load vars for managed-node3 13830 1727204133.50836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204133.52852: done with get_vars() 13830 1727204133.52897: done getting variables 13830 1727204133.52981: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.060) 0:01:06.608 ***** 13830 1727204133.53032: entering _queue_task() for managed-node3/debug 13830 1727204133.53415: worker is 1 (out of 1 available) 13830 1727204133.53429: exiting _queue_task() for managed-node3/debug 13830 1727204133.53449: done queuing things up, now waiting for results queue to drain 13830 1727204133.53451: waiting for pending results... 
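The "Configure networking state" task recorded a few lines above (task path roles/network/tasks/main.yml:171 in this collection checkout) was skipped because its conditional network_state != {} evaluated to False, i.e. no declarative state was requested in this run. The fragment below is a minimal, hypothetical reconstruction of that guard pattern only; the localhost play and the debug stand-in body are assumptions for illustration and are not the role's actual source.

- hosts: localhost
  gather_facts: false
  vars:
    network_state: {}                  # empty dict, as in this run
  tasks:
    - name: Configure networking state (guard illustration)
      ansible.builtin.debug:           # stand-in for the real provider call
        msg: "would apply network_state here"
      when: network_state != {}        # False for an empty dict, so the task is skipped

Run with enough verbosity, this prints a skipping result carrying "false_condition": "network_state != {}", matching the output captured in the log.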
13830 1727204133.53812: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13830 1727204133.54032: in run() - task 0affcd87-79f5-1659-6b02-000000000e1c 13830 1727204133.54056: variable 'ansible_search_path' from source: unknown 13830 1727204133.54066: variable 'ansible_search_path' from source: unknown 13830 1727204133.54117: calling self._execute() 13830 1727204133.54261: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.54276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.54291: variable 'omit' from source: magic vars 13830 1727204133.54842: variable 'ansible_distribution_major_version' from source: facts 13830 1727204133.54870: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204133.54882: variable 'omit' from source: magic vars 13830 1727204133.54965: variable 'omit' from source: magic vars 13830 1727204133.55015: variable 'omit' from source: magic vars 13830 1727204133.55068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204133.55116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204133.55144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204133.55167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204133.55184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204133.55225: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204133.55233: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.55243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.55358: Set connection var ansible_connection to ssh 13830 1727204133.55376: Set connection var ansible_timeout to 10 13830 1727204133.55386: Set connection var ansible_shell_executable to /bin/sh 13830 1727204133.55392: Set connection var ansible_shell_type to sh 13830 1727204133.55402: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204133.55420: Set connection var ansible_pipelining to False 13830 1727204133.55456: variable 'ansible_shell_executable' from source: unknown 13830 1727204133.55465: variable 'ansible_connection' from source: unknown 13830 1727204133.55473: variable 'ansible_module_compression' from source: unknown 13830 1727204133.55479: variable 'ansible_shell_type' from source: unknown 13830 1727204133.55485: variable 'ansible_shell_executable' from source: unknown 13830 1727204133.55491: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.55498: variable 'ansible_pipelining' from source: unknown 13830 1727204133.55503: variable 'ansible_timeout' from source: unknown 13830 1727204133.55510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.55674: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 
1727204133.55691: variable 'omit' from source: magic vars 13830 1727204133.55701: starting attempt loop 13830 1727204133.55707: running the handler 13830 1727204133.55857: variable '__network_connections_result' from source: set_fact 13830 1727204133.55947: handler run complete 13830 1727204133.55978: attempt loop complete, returning result 13830 1727204133.55985: _execute() done 13830 1727204133.55992: dumping result to json 13830 1727204133.55999: done dumping result, returning 13830 1727204133.56013: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-1659-6b02-000000000e1c] 13830 1727204133.56022: sending task result for task 0affcd87-79f5-1659-6b02-000000000e1c ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 13830 1727204133.56217: no more pending results, returning what we have 13830 1727204133.56221: results queue empty 13830 1727204133.56222: checking for any_errors_fatal 13830 1727204133.56230: done checking for any_errors_fatal 13830 1727204133.56230: checking for max_fail_percentage 13830 1727204133.56232: done checking for max_fail_percentage 13830 1727204133.56236: checking to see if all hosts have failed and the running result is not ok 13830 1727204133.56237: done checking to see if all hosts have failed 13830 1727204133.56237: getting the remaining hosts for this loop 13830 1727204133.56240: done getting the remaining hosts for this loop 13830 1727204133.56246: getting the next task for host managed-node3 13830 1727204133.56254: done getting next task for host managed-node3 13830 1727204133.56258: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13830 1727204133.56265: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204133.56280: getting variables 13830 1727204133.56282: in VariableManager get_vars() 13830 1727204133.56333: Calling all_inventory to load vars for managed-node3 13830 1727204133.56338: Calling groups_inventory to load vars for managed-node3 13830 1727204133.56341: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204133.56352: Calling all_plugins_play to load vars for managed-node3 13830 1727204133.56355: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204133.56358: Calling groups_plugins_play to load vars for managed-node3 13830 1727204133.57305: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e1c 13830 1727204133.57308: WORKER PROCESS EXITING 13830 1727204133.57812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204133.58756: done with get_vars() 13830 1727204133.58787: done getting variables 13830 1727204133.58842: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.058) 0:01:06.667 ***** 13830 1727204133.58897: entering _queue_task() for managed-node3/debug 13830 1727204133.59249: worker is 1 (out of 1 available) 13830 1727204133.59263: exiting _queue_task() for managed-node3/debug 13830 1727204133.59277: done queuing things up, now waiting for results queue to drain 13830 1727204133.59279: waiting for pending results... 
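The ok result a few lines above is the shape ansible.builtin.debug produces when asked to print a single attribute of a stored result, here __network_connections_result.stderr_lines. A self-contained sketch that reproduces the same output shape follows; the set_fact stand-in and the localhost play are assumptions for illustration, since in the real run the variable is populated by the network_connections module.

- hosts: localhost
  gather_facts: false
  tasks:
    - name: Stand-in for the registered module result (hypothetical values taken from this log)
      ansible.builtin.set_fact:
        __network_connections_result:
          changed: true
          stderr: "\n"
          stderr_lines: [""]

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines   # prints [""] as in the log above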
13830 1727204133.59591: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13830 1727204133.59744: in run() - task 0affcd87-79f5-1659-6b02-000000000e1d 13830 1727204133.59758: variable 'ansible_search_path' from source: unknown 13830 1727204133.59762: variable 'ansible_search_path' from source: unknown 13830 1727204133.59814: calling self._execute() 13830 1727204133.59906: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.59911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.59920: variable 'omit' from source: magic vars 13830 1727204133.60326: variable 'ansible_distribution_major_version' from source: facts 13830 1727204133.60348: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204133.60351: variable 'omit' from source: magic vars 13830 1727204133.60399: variable 'omit' from source: magic vars 13830 1727204133.60426: variable 'omit' from source: magic vars 13830 1727204133.60469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204133.60498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204133.60514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204133.60526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204133.60537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204133.60566: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204133.60572: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.60578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.60650: Set connection var ansible_connection to ssh 13830 1727204133.60658: Set connection var ansible_timeout to 10 13830 1727204133.60663: Set connection var ansible_shell_executable to /bin/sh 13830 1727204133.60668: Set connection var ansible_shell_type to sh 13830 1727204133.60673: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204133.60681: Set connection var ansible_pipelining to False 13830 1727204133.60699: variable 'ansible_shell_executable' from source: unknown 13830 1727204133.60703: variable 'ansible_connection' from source: unknown 13830 1727204133.60706: variable 'ansible_module_compression' from source: unknown 13830 1727204133.60708: variable 'ansible_shell_type' from source: unknown 13830 1727204133.60710: variable 'ansible_shell_executable' from source: unknown 13830 1727204133.60712: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.60714: variable 'ansible_pipelining' from source: unknown 13830 1727204133.60716: variable 'ansible_timeout' from source: unknown 13830 1727204133.60724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.60824: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 
1727204133.60835: variable 'omit' from source: magic vars 13830 1727204133.60842: starting attempt loop 13830 1727204133.60845: running the handler 13830 1727204133.60884: variable '__network_connections_result' from source: set_fact 13830 1727204133.60945: variable '__network_connections_result' from source: set_fact 13830 1727204133.61026: handler run complete 13830 1727204133.61047: attempt loop complete, returning result 13830 1727204133.61050: _execute() done 13830 1727204133.61052: dumping result to json 13830 1727204133.61055: done dumping result, returning 13830 1727204133.61063: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-1659-6b02-000000000e1d] 13830 1727204133.61070: sending task result for task 0affcd87-79f5-1659-6b02-000000000e1d ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13830 1727204133.61260: no more pending results, returning what we have 13830 1727204133.61266: results queue empty 13830 1727204133.61267: checking for any_errors_fatal 13830 1727204133.61279: done checking for any_errors_fatal 13830 1727204133.61280: checking for max_fail_percentage 13830 1727204133.61281: done checking for max_fail_percentage 13830 1727204133.61282: checking to see if all hosts have failed and the running result is not ok 13830 1727204133.61283: done checking to see if all hosts have failed 13830 1727204133.61284: getting the remaining hosts for this loop 13830 1727204133.61285: done getting the remaining hosts for this loop 13830 1727204133.61289: getting the next task for host managed-node3 13830 1727204133.61297: done getting next task for host managed-node3 13830 1727204133.61300: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13830 1727204133.61304: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204133.61317: getting variables 13830 1727204133.61318: in VariableManager get_vars() 13830 1727204133.61359: Calling all_inventory to load vars for managed-node3 13830 1727204133.61361: Calling groups_inventory to load vars for managed-node3 13830 1727204133.61365: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204133.61379: Calling all_plugins_play to load vars for managed-node3 13830 1727204133.61382: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204133.61384: Calling groups_plugins_play to load vars for managed-node3 13830 1727204133.61923: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e1d 13830 1727204133.61927: WORKER PROCESS EXITING 13830 1727204133.62220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204133.64090: done with get_vars() 13830 1727204133.64123: done getting variables 13830 1727204133.64201: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.053) 0:01:06.720 ***** 13830 1727204133.64252: entering _queue_task() for managed-node3/debug 13830 1727204133.64629: worker is 1 (out of 1 available) 13830 1727204133.64646: exiting _queue_task() for managed-node3/debug 13830 1727204133.64658: done queuing things up, now waiting for results queue to drain 13830 1727204133.64659: waiting for pending results... 
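
The debug task that just completed and the one queued next are the network role's own "Show debug messages" tasks from roles/network/tasks/main.yml. The first prints __network_connections_result, whose module_args confirm this is the teardown pass of the bond test: bond0, bond0.0 and bond0.1 are all being set to persistent_state: absent, state: down via the nm provider. The second is skipped because the role default network_state is an empty dict. Below is a minimal sketch of what these two role tasks plausibly look like, assuming plain debug tasks; the task names, the variable displayed by the first task, and the false condition network_state != {} are taken from the log, while everything else (including the variable name shown by the second task) is an assumption.

    # Hypothetical reconstruction of the debug tasks around roles/network/tasks/main.yml:186;
    # only the names, __network_connections_result, and the when condition are confirmed by the log.
    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result

    - name: Show debug messages for the network_state
      debug:
        var: __network_state_result   # variable name assumed; this task is skipped below
      when: network_state != {}       # evaluated to False in this run, hence "skipping"
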
13830 1727204133.64999: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13830 1727204133.65191: in run() - task 0affcd87-79f5-1659-6b02-000000000e1e 13830 1727204133.65214: variable 'ansible_search_path' from source: unknown 13830 1727204133.65228: variable 'ansible_search_path' from source: unknown 13830 1727204133.65277: calling self._execute() 13830 1727204133.65390: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.65402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.65417: variable 'omit' from source: magic vars 13830 1727204133.65824: variable 'ansible_distribution_major_version' from source: facts 13830 1727204133.65845: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204133.65987: variable 'network_state' from source: role '' defaults 13830 1727204133.66013: Evaluated conditional (network_state != {}): False 13830 1727204133.66022: when evaluation is False, skipping this task 13830 1727204133.66029: _execute() done 13830 1727204133.66039: dumping result to json 13830 1727204133.66047: done dumping result, returning 13830 1727204133.66057: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-1659-6b02-000000000e1e] 13830 1727204133.66072: sending task result for task 0affcd87-79f5-1659-6b02-000000000e1e 13830 1727204133.66200: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e1e skipping: [managed-node3] => { "false_condition": "network_state != {}" } 13830 1727204133.66260: no more pending results, returning what we have 13830 1727204133.66266: results queue empty 13830 1727204133.66267: checking for any_errors_fatal 13830 1727204133.66277: done checking for any_errors_fatal 13830 1727204133.66279: checking for max_fail_percentage 13830 1727204133.66281: done checking for max_fail_percentage 13830 1727204133.66282: checking to see if all hosts have failed and the running result is not ok 13830 1727204133.66283: done checking to see if all hosts have failed 13830 1727204133.66284: getting the remaining hosts for this loop 13830 1727204133.66286: done getting the remaining hosts for this loop 13830 1727204133.66290: getting the next task for host managed-node3 13830 1727204133.66301: done getting next task for host managed-node3 13830 1727204133.66306: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13830 1727204133.66312: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204133.66342: getting variables 13830 1727204133.66345: in VariableManager get_vars() 13830 1727204133.66398: Calling all_inventory to load vars for managed-node3 13830 1727204133.66401: Calling groups_inventory to load vars for managed-node3 13830 1727204133.66403: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204133.66416: Calling all_plugins_play to load vars for managed-node3 13830 1727204133.66419: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204133.66422: Calling groups_plugins_play to load vars for managed-node3 13830 1727204133.67390: WORKER PROCESS EXITING 13830 1727204133.68479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204133.70390: done with get_vars() 13830 1727204133.70426: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:33 -0400 (0:00:00.063) 0:01:06.783 ***** 13830 1727204133.70554: entering _queue_task() for managed-node3/ping 13830 1727204133.70967: worker is 1 (out of 1 available) 13830 1727204133.70980: exiting _queue_task() for managed-node3/ping 13830 1727204133.70992: done queuing things up, now waiting for results queue to drain 13830 1727204133.70993: waiting for pending results... 13830 1727204133.71340: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13830 1727204133.71514: in run() - task 0affcd87-79f5-1659-6b02-000000000e1f 13830 1727204133.71538: variable 'ansible_search_path' from source: unknown 13830 1727204133.71546: variable 'ansible_search_path' from source: unknown 13830 1727204133.71596: calling self._execute() 13830 1727204133.71707: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.71717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.71730: variable 'omit' from source: magic vars 13830 1727204133.72165: variable 'ansible_distribution_major_version' from source: facts 13830 1727204133.72185: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204133.72197: variable 'omit' from source: magic vars 13830 1727204133.72291: variable 'omit' from source: magic vars 13830 1727204133.72345: variable 'omit' from source: magic vars 13830 1727204133.72396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204133.72450: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204133.72480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204133.72502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204133.72518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204133.72567: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204133.72576: variable 'ansible_host' from source: host vars for 
'managed-node3' 13830 1727204133.72584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.72700: Set connection var ansible_connection to ssh 13830 1727204133.72716: Set connection var ansible_timeout to 10 13830 1727204133.72726: Set connection var ansible_shell_executable to /bin/sh 13830 1727204133.72732: Set connection var ansible_shell_type to sh 13830 1727204133.72746: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204133.72766: Set connection var ansible_pipelining to False 13830 1727204133.72799: variable 'ansible_shell_executable' from source: unknown 13830 1727204133.72806: variable 'ansible_connection' from source: unknown 13830 1727204133.72815: variable 'ansible_module_compression' from source: unknown 13830 1727204133.72821: variable 'ansible_shell_type' from source: unknown 13830 1727204133.72827: variable 'ansible_shell_executable' from source: unknown 13830 1727204133.72836: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204133.72844: variable 'ansible_pipelining' from source: unknown 13830 1727204133.72849: variable 'ansible_timeout' from source: unknown 13830 1727204133.72856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204133.73082: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 13830 1727204133.73104: variable 'omit' from source: magic vars 13830 1727204133.73113: starting attempt loop 13830 1727204133.73118: running the handler 13830 1727204133.73137: _low_level_execute_command(): starting 13830 1727204133.73150: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204133.73980: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204133.73997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.74014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.74036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.74094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.74107: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204133.74121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.74142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204133.74154: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204133.74168: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204133.74183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.74203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.74219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.74231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.74247: stderr chunk (state=3): >>>debug2: match found <<< 13830 
1727204133.74262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.74353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204133.74381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204133.74403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204133.74490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204133.76119: stdout chunk (state=3): >>>/root <<< 13830 1727204133.76213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204133.76325: stderr chunk (state=3): >>><<< 13830 1727204133.76342: stdout chunk (state=3): >>><<< 13830 1727204133.76471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204133.76475: _low_level_execute_command(): starting 13830 1727204133.76478: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144 `" && echo ansible-tmp-1727204133.763816-18635-23717472897144="` echo /root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144 `" ) && sleep 0' 13830 1727204133.77129: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204133.77142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.77161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.77179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.77220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.77227: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204133.77241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.77256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204133.77277: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204133.77284: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
13830 1727204133.77292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.77302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.77314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.77322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.77329: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204133.77342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.77428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204133.77450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204133.77463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204133.77540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204133.79367: stdout chunk (state=3): >>>ansible-tmp-1727204133.763816-18635-23717472897144=/root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144 <<< 13830 1727204133.79473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204133.79570: stderr chunk (state=3): >>><<< 13830 1727204133.79576: stdout chunk (state=3): >>><<< 13830 1727204133.79615: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204133.763816-18635-23717472897144=/root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204133.79668: variable 'ansible_module_compression' from source: unknown 13830 1727204133.79717: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13830 1727204133.79756: variable 'ansible_facts' from source: unknown 13830 1727204133.79834: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144/AnsiballZ_ping.py 13830 1727204133.80005: Sending initial data 13830 1727204133.80009: Sent initial data (151 bytes) 13830 1727204133.81119: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204133.81128: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 13830 1727204133.81141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.81155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.81208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.81216: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204133.81226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.81243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204133.81250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204133.81257: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204133.81266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.81284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.81305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.81313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.81321: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204133.81335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.81419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204133.81442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204133.81457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204133.81541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204133.83232: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204133.83273: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204133.83318: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp73vd0h01 /root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144/AnsiballZ_ping.py <<< 13830 1727204133.83358: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204133.84455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204133.84541: stderr chunk (state=3): >>><<< 13830 1727204133.84545: stdout chunk (state=3): >>><<< 13830 1727204133.84570: done transferring module to remote 13830 1727204133.84584: _low_level_execute_command(): starting 13830 1727204133.84592: _low_level_execute_command(): executing: /bin/sh -c 'chmod 
u+x /root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144/ /root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144/AnsiballZ_ping.py && sleep 0' 13830 1727204133.85263: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204133.85279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.85290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.85304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.85345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.85352: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204133.85364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.85379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204133.85387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204133.85394: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204133.85401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.85411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.85427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.85437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.85440: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204133.85450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.85524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204133.85544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204133.85557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204133.85627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204133.87373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204133.87377: stdout chunk (state=3): >>><<< 13830 1727204133.87382: stderr chunk (state=3): >>><<< 13830 1727204133.87405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204133.87409: _low_level_execute_command(): starting 13830 1727204133.87412: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144/AnsiballZ_ping.py && sleep 0' 13830 1727204133.88123: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204133.88140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.88149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.88173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.88214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.88221: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204133.88231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.88250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204133.88257: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204133.88274: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204133.88282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204133.88292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204133.88305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204133.88312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204133.88319: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204133.88328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204133.88407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204133.88426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204133.88439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204133.88514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.01201: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13830 1727204134.02187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204134.02248: stderr chunk (state=3): >>><<< 13830 1727204134.02252: stdout chunk (state=3): >>><<< 13830 1727204134.02272: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204134.02298: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204134.02310: _low_level_execute_command(): starting 13830 1727204134.02312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204133.763816-18635-23717472897144/ > /dev/null 2>&1 && sleep 0' 13830 1727204134.03098: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204134.03105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.03116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.03130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.03178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.03184: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.03194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.03206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.03214: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.03221: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.03229: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.03243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.03259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.03267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.03274: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.03283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.03358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204134.03377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.03387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.03454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.05272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.05280: stdout chunk (state=3): >>><<< 13830 1727204134.05291: stderr chunk (state=3): >>><<< 13830 1727204134.05305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.05312: handler run complete 13830 1727204134.05328: attempt loop complete, returning result 13830 1727204134.05331: _execute() done 13830 1727204134.05334: dumping result to json 13830 1727204134.05340: done dumping result, returning 13830 1727204134.05350: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-1659-6b02-000000000e1f] 13830 1727204134.05356: sending task result for task 0affcd87-79f5-1659-6b02-000000000e1f 13830 1727204134.05456: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e1f 13830 1727204134.05459: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 13830 1727204134.05543: no more pending results, returning what we have 13830 1727204134.05548: results queue empty 13830 1727204134.05549: checking for any_errors_fatal 13830 1727204134.05558: done checking for any_errors_fatal 13830 1727204134.05558: checking for max_fail_percentage 13830 1727204134.05561: done checking for max_fail_percentage 13830 1727204134.05562: 
checking to see if all hosts have failed and the running result is not ok 13830 1727204134.05563: done checking to see if all hosts have failed 13830 1727204134.05566: getting the remaining hosts for this loop 13830 1727204134.05568: done getting the remaining hosts for this loop 13830 1727204134.05573: getting the next task for host managed-node3 13830 1727204134.05587: done getting next task for host managed-node3 13830 1727204134.05589: ^ task is: TASK: meta (role_complete) 13830 1727204134.05595: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204134.05610: getting variables 13830 1727204134.05613: in VariableManager get_vars() 13830 1727204134.05681: Calling all_inventory to load vars for managed-node3 13830 1727204134.05684: Calling groups_inventory to load vars for managed-node3 13830 1727204134.05687: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204134.05698: Calling all_plugins_play to load vars for managed-node3 13830 1727204134.05701: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204134.05705: Calling groups_plugins_play to load vars for managed-node3 13830 1727204134.07824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204134.09752: done with get_vars() 13830 1727204134.09780: done getting variables 13830 1727204134.09873: done queuing things up, now waiting for results queue to drain 13830 1727204134.09875: results queue empty 13830 1727204134.09876: checking for any_errors_fatal 13830 1727204134.09879: done checking for any_errors_fatal 13830 1727204134.09880: checking for max_fail_percentage 13830 1727204134.09881: done checking for max_fail_percentage 13830 1727204134.09881: checking to see if all hosts have failed and the running result is not ok 13830 1727204134.09882: done checking to see if all hosts have failed 13830 1727204134.09883: getting the remaining hosts for this loop 13830 1727204134.09884: done getting the remaining hosts for this loop 13830 1727204134.09892: getting the next task for host managed-node3 13830 1727204134.09896: done getting next task for host managed-node3 13830 1727204134.09898: ^ task is: TASK: Delete the device '{{ controller_device }}' 13830 1727204134.09901: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204134.09904: getting variables 13830 1727204134.09905: in VariableManager get_vars() 13830 1727204134.09922: Calling all_inventory to load vars for managed-node3 13830 1727204134.09924: Calling groups_inventory to load vars for managed-node3 13830 1727204134.09926: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204134.09931: Calling all_plugins_play to load vars for managed-node3 13830 1727204134.09933: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204134.09936: Calling groups_plugins_play to load vars for managed-node3 13830 1727204134.11024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204134.12741: done with get_vars() 13830 1727204134.12774: done getting variables 13830 1727204134.12830: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 13830 1727204134.12965: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.424) 0:01:07.208 ***** 13830 1727204134.13003: entering _queue_task() for managed-node3/command 13830 1727204134.13386: worker is 1 (out of 1 available) 13830 1727204134.13401: exiting _queue_task() for managed-node3/command 13830 1727204134.13414: done queuing things up, now waiting for results queue to drain 13830 1727204134.13416: waiting for pending results... 
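
With the network role finished (meta: role_complete), the play moves on to the test's own cleanup include, cleanup_bond_profile+device.yml, whose task at line 22 deletes the bond device; controller_device expands to 'nm-bond' in the task name. Below is a minimal sketch of what that cleanup task could look like, assuming a straightforward command-based deletion; only the templated task name, the file and line, and the use of the command action are confirmed by the log, while the exact command line and error handling are assumptions.

    # Hypothetical sketch of the task at playbooks/tasks/cleanup_bond_profile+device.yml:22.
    - name: Delete the device '{{ controller_device }}'
      command: ip link del {{ controller_device }}   # actual command not shown in the log; assumed
      failed_when: false                             # assumed, so cleanup cannot fail the test run

The records that follow show this command task going through the same module-execution sequence already seen for the ping re-test above: find the remote home directory, create a per-task directory under /root/.ansible/tmp, sftp the generated AnsiballZ_command.py payload into it, chmod it, execute it with /usr/bin/python3.9, and (as with the ping task) remove the temporary directory afterwards.
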
13830 1727204134.13720: running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' 13830 1727204134.13850: in run() - task 0affcd87-79f5-1659-6b02-000000000e4f 13830 1727204134.13876: variable 'ansible_search_path' from source: unknown 13830 1727204134.13884: variable 'ansible_search_path' from source: unknown 13830 1727204134.13922: calling self._execute() 13830 1727204134.14030: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204134.14041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204134.14054: variable 'omit' from source: magic vars 13830 1727204134.14423: variable 'ansible_distribution_major_version' from source: facts 13830 1727204134.14441: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204134.14451: variable 'omit' from source: magic vars 13830 1727204134.14479: variable 'omit' from source: magic vars 13830 1727204134.14582: variable 'controller_device' from source: play vars 13830 1727204134.14603: variable 'omit' from source: magic vars 13830 1727204134.14654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204134.14695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204134.14719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204134.14745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204134.14760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204134.14795: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204134.14802: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204134.14809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204134.14915: Set connection var ansible_connection to ssh 13830 1727204134.14930: Set connection var ansible_timeout to 10 13830 1727204134.14939: Set connection var ansible_shell_executable to /bin/sh 13830 1727204134.14950: Set connection var ansible_shell_type to sh 13830 1727204134.14960: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204134.14975: Set connection var ansible_pipelining to False 13830 1727204134.15000: variable 'ansible_shell_executable' from source: unknown 13830 1727204134.15008: variable 'ansible_connection' from source: unknown 13830 1727204134.15015: variable 'ansible_module_compression' from source: unknown 13830 1727204134.15022: variable 'ansible_shell_type' from source: unknown 13830 1727204134.15027: variable 'ansible_shell_executable' from source: unknown 13830 1727204134.15033: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204134.15040: variable 'ansible_pipelining' from source: unknown 13830 1727204134.15046: variable 'ansible_timeout' from source: unknown 13830 1727204134.15059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204134.15198: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 13830 1727204134.15215: variable 'omit' from source: magic vars 13830 1727204134.15224: starting attempt loop 13830 1727204134.15230: running the handler 13830 1727204134.15249: _low_level_execute_command(): starting 13830 1727204134.15260: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204134.15995: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204134.16008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.16024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.16046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.16095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.16109: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.16125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.16148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.16161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.16176: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.16189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.16203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.16219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.16232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.16245: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.16263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.16341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204134.16358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.16378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.16462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.18041: stdout chunk (state=3): >>>/root <<< 13830 1727204134.18141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.18215: stderr chunk (state=3): >>><<< 13830 1727204134.18221: stdout chunk (state=3): >>><<< 13830 1727204134.18256: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.18269: _low_level_execute_command(): starting 13830 1727204134.18275: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085 `" && echo ansible-tmp-1727204134.1825466-18651-70002261381085="` echo /root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085 `" ) && sleep 0' 13830 1727204134.18922: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204134.18931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.18944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.18958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.18999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.19006: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.19016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.19029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.19039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.19046: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.19054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.19063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.19078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.19086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.19092: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.19102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.19178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204134.19192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.19202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.19278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.21100: stdout chunk (state=3): >>>ansible-tmp-1727204134.1825466-18651-70002261381085=/root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085 <<< 13830 1727204134.21310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.21314: stdout chunk (state=3): >>><<< 13830 1727204134.21317: stderr chunk (state=3): >>><<< 13830 1727204134.21644: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204134.1825466-18651-70002261381085=/root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.21648: variable 'ansible_module_compression' from source: unknown 13830 1727204134.21651: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204134.21653: variable 'ansible_facts' from source: unknown 13830 1727204134.21656: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085/AnsiballZ_command.py 13830 1727204134.21721: Sending initial data 13830 1727204134.21725: Sent initial data (155 bytes) 13830 1727204134.22708: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204134.22723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.22744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.22765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.22807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.22821: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.22840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.22859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.22873: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.22885: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.22897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.22911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.22925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.22936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.22951: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.22969: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.23087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.23091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.23152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.24842: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204134.24861: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204134.24896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpr30oc51q /root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085/AnsiballZ_command.py <<< 13830 1727204134.24934: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204134.26030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.26171: stderr chunk (state=3): >>><<< 13830 1727204134.26174: stdout chunk (state=3): >>><<< 13830 1727204134.26177: done transferring module to remote 13830 1727204134.26186: _low_level_execute_command(): starting 13830 1727204134.26193: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085/ /root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085/AnsiballZ_command.py && sleep 0' 13830 1727204134.26640: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.26645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.26697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204134.26701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.26704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204134.26706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.26762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.26809: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.26885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.28644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.28729: stderr chunk (state=3): >>><<< 13830 1727204134.28743: stdout chunk (state=3): >>><<< 13830 1727204134.28853: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.28857: _low_level_execute_command(): starting 13830 1727204134.28860: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085/AnsiballZ_command.py && sleep 0' 13830 1727204134.29806: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.29893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.29905: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.29919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.29941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.29955: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.29979: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.29994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.30008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.30025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.30040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.30052: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.30066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.30147: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 13830 1727204134.30171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.30195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.30275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.44003: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:34.432111", "end": "2024-09-24 14:55:34.438904", "delta": "0:00:00.006793", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204134.45149: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. <<< 13830 1727204134.45154: stdout chunk (state=3): >>><<< 13830 1727204134.45156: stderr chunk (state=3): >>><<< 13830 1727204134.45322: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:34.432111", "end": "2024-09-24 14:55:34.438904", "delta": "0:00:00.006793", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. 
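The module run above returns rc=1 only because the bond device is already gone ("Cannot find device \"nm-bond\""); the play tolerates this through its failed_when handling, so the cleanup stays idempotent. A minimal shell sketch of the same tolerant-delete pattern, purely as an illustration and not taken from the playbook source:

    # Illustrative only: delete nm-bond if it exists, succeed either way.
    if ip link show nm-bond >/dev/null 2>&1; then
        ip link del nm-bond
    else
        echo 'nm-bond not present; nothing to delete' >&2
    fi
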
13830 1727204134.45331: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204134.45338: _low_level_execute_command(): starting 13830 1727204134.45341: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204134.1825466-18651-70002261381085/ > /dev/null 2>&1 && sleep 0' 13830 1727204134.46016: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204134.46032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.46051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.46084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.46133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.46150: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.46167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.46188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.46207: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.46219: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.46231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.46248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.46263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.46278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.46288: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.46305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.46390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204134.46407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.46429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.46543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.48794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.48798: stdout chunk (state=3): >>><<< 13830 1727204134.48801: stderr chunk (state=3): >>><<< 13830 1727204134.49075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.49079: handler run complete 13830 1727204134.49082: Evaluated conditional (False): False 13830 1727204134.49084: Evaluated conditional (False): False 13830 1727204134.49087: attempt loop complete, returning result 13830 1727204134.49089: _execute() done 13830 1727204134.49091: dumping result to json 13830 1727204134.49093: done dumping result, returning 13830 1727204134.49095: done running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' [0affcd87-79f5-1659-6b02-000000000e4f] 13830 1727204134.49097: sending task result for task 0affcd87-79f5-1659-6b02-000000000e4f 13830 1727204134.49180: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e4f 13830 1727204134.49183: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.006793", "end": "2024-09-24 14:55:34.438904", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:55:34.432111" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 13830 1727204134.49265: no more pending results, returning what we have 13830 1727204134.49270: results queue empty 13830 1727204134.49271: checking for any_errors_fatal 13830 1727204134.49273: done checking for any_errors_fatal 13830 1727204134.49274: checking for max_fail_percentage 13830 1727204134.49276: done checking for max_fail_percentage 13830 1727204134.49277: checking to see if all hosts have failed and the running result is not ok 13830 1727204134.49278: done checking to see if all hosts have failed 13830 1727204134.49279: getting the remaining hosts for this loop 13830 1727204134.49281: done getting the remaining hosts for this loop 13830 1727204134.49285: getting the next task for host managed-node3 13830 1727204134.49299: done getting next task for host managed-node3 13830 1727204134.49303: ^ task is: TASK: Remove test interfaces 13830 1727204134.49307: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204134.49314: getting variables 13830 1727204134.49316: in VariableManager get_vars() 13830 1727204134.49374: Calling all_inventory to load vars for managed-node3 13830 1727204134.49377: Calling groups_inventory to load vars for managed-node3 13830 1727204134.49380: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204134.49393: Calling all_plugins_play to load vars for managed-node3 13830 1727204134.49396: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204134.49399: Calling groups_plugins_play to load vars for managed-node3 13830 1727204134.51350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204134.53476: done with get_vars() 13830 1727204134.53511: done getting variables 13830 1727204134.53587: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.406) 0:01:07.614 ***** 13830 1727204134.53623: entering _queue_task() for managed-node3/shell 13830 1727204134.54010: worker is 1 (out of 1 available) 13830 1727204134.54024: exiting _queue_task() for managed-node3/shell 13830 1727204134.54037: done queuing things up, now waiting for results queue to drain 13830 1727204134.54038: waiting for pending results... 
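The trace above walks through Ansible's standard low-level execution cycle for a command task: create a private temp directory on the target, sftp the AnsiballZ_command.py payload into it, mark it executable, run it with the remote Python, then remove the directory. Condensed into the remote-side shell steps visible in the log (the timestamped directory name is shortened here, and a stand-in file replaces the sftp upload):

    # Condensed sketch of the remote-side steps driven above.
    tmp="$HOME/.ansible/tmp/ansible-tmp-EXAMPLE"
    ( umask 77 && mkdir -p "$HOME/.ansible/tmp" && mkdir "$tmp" )   # private workdir, mode 0700
    touch "$tmp/AnsiballZ_command.py"                               # stand-in for the sftp upload shown in the log
    chmod u+x "$tmp" "$tmp/AnsiballZ_command.py"                    # make payload runnable
    /usr/bin/python3.9 "$tmp/AnsiballZ_command.py"                  # execute the wrapped module (interpreter as seen in the log)
    rm -rf "$tmp"                                                   # clean up the temp directory
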
13830 1727204134.54343: running TaskExecutor() for managed-node3/TASK: Remove test interfaces 13830 1727204134.54479: in run() - task 0affcd87-79f5-1659-6b02-000000000e55 13830 1727204134.54511: variable 'ansible_search_path' from source: unknown 13830 1727204134.54520: variable 'ansible_search_path' from source: unknown 13830 1727204134.54561: calling self._execute() 13830 1727204134.54681: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204134.54692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204134.54717: variable 'omit' from source: magic vars 13830 1727204134.55210: variable 'ansible_distribution_major_version' from source: facts 13830 1727204134.55231: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204134.55244: variable 'omit' from source: magic vars 13830 1727204134.55306: variable 'omit' from source: magic vars 13830 1727204134.55487: variable 'dhcp_interface1' from source: play vars 13830 1727204134.55502: variable 'dhcp_interface2' from source: play vars 13830 1727204134.55528: variable 'omit' from source: magic vars 13830 1727204134.55588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204134.55635: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204134.55667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204134.55695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204134.55716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204134.55752: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204134.55761: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204134.55771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204134.55888: Set connection var ansible_connection to ssh 13830 1727204134.55909: Set connection var ansible_timeout to 10 13830 1727204134.55921: Set connection var ansible_shell_executable to /bin/sh 13830 1727204134.55932: Set connection var ansible_shell_type to sh 13830 1727204134.55943: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204134.55958: Set connection var ansible_pipelining to False 13830 1727204134.55988: variable 'ansible_shell_executable' from source: unknown 13830 1727204134.55996: variable 'ansible_connection' from source: unknown 13830 1727204134.56003: variable 'ansible_module_compression' from source: unknown 13830 1727204134.56011: variable 'ansible_shell_type' from source: unknown 13830 1727204134.56021: variable 'ansible_shell_executable' from source: unknown 13830 1727204134.56028: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204134.56040: variable 'ansible_pipelining' from source: unknown 13830 1727204134.56048: variable 'ansible_timeout' from source: unknown 13830 1727204134.56057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204134.56214: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204134.56237: variable 'omit' from source: magic vars 13830 1727204134.56247: starting attempt loop 13830 1727204134.56258: running the handler 13830 1727204134.56275: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204134.56300: _low_level_execute_command(): starting 13830 1727204134.56313: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204134.57142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204134.57157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.57371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.57938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.57991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.58004: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.58025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.58043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.58054: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.58066: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.58077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.58089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.58102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.58111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.58120: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.58134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.58212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204134.58366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.58380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.58524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.60046: stdout chunk (state=3): >>>/root <<< 13830 1727204134.60243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.60247: stdout chunk (state=3): >>><<< 13830 1727204134.60249: stderr chunk (state=3): >>><<< 13830 1727204134.60374: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.60387: _low_level_execute_command(): starting 13830 1727204134.60391: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307 `" && echo ansible-tmp-1727204134.6027381-18679-210871905167307="` echo /root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307 `" ) && sleep 0' 13830 1727204134.61924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204134.61940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.61957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.61979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.62029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.62041: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.62055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.62120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.62133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.62144: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.62156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.62171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.62187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.62199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.62212: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.62225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.62304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204134.62450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.62470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.62549: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 13830 1727204134.64352: stdout chunk (state=3): >>>ansible-tmp-1727204134.6027381-18679-210871905167307=/root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307 <<< 13830 1727204134.64487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.64584: stderr chunk (state=3): >>><<< 13830 1727204134.64588: stdout chunk (state=3): >>><<< 13830 1727204134.64871: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204134.6027381-18679-210871905167307=/root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.64874: variable 'ansible_module_compression' from source: unknown 13830 1727204134.64877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204134.64879: variable 'ansible_facts' from source: unknown 13830 1727204134.64881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307/AnsiballZ_command.py 13830 1727204134.65425: Sending initial data 13830 1727204134.65429: Sent initial data (156 bytes) 13830 1727204134.67628: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204134.67647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.67665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.67689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.67740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.67752: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.67767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.67784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.67795: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.67807: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204134.67826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.67840: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.67856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.67872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.67884: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.67897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.67978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204134.68000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.68017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.68097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.69778: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204134.69816: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204134.69859: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmp_ql4z8vp /root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307/AnsiballZ_command.py <<< 13830 1727204134.69903: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204134.71202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.71313: stderr chunk (state=3): >>><<< 13830 1727204134.71317: stdout chunk (state=3): >>><<< 13830 1727204134.71342: done transferring module to remote 13830 1727204134.71357: _low_level_execute_command(): starting 13830 1727204134.71360: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307/ /root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307/AnsiballZ_command.py && sleep 0' 13830 1727204134.72118: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.72123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.72170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.72174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13830 1727204134.72176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.72229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.72234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.72287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.73986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.74067: stderr chunk (state=3): >>><<< 13830 1727204134.74071: stdout chunk (state=3): >>><<< 13830 1727204134.74087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.74090: _low_level_execute_command(): starting 13830 1727204134.74097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307/AnsiballZ_command.py && sleep 0' 13830 1727204134.75118: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.75162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13830 1727204134.75173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.75176: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204134.75189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.75289: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.75332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.94451: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:34.885778", "end": "2024-09-24 14:55:34.939395", "delta": "0:00:00.053617", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204134.96190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204134.96241: stderr chunk (state=3): >>><<< 13830 1727204134.96244: stdout chunk (state=3): >>><<< 13830 1727204134.96262: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:34.885778", "end": "2024-09-24 14:55:34.939395", "delta": "0:00:00.053617", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
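For readability, the shell payload that appears backslash-escaped inside the JSON result above, unescaped:

    # Script run by the 'Remove test interfaces' task, as transmitted above:
    # each delete captures its exit code and reports a failure without aborting.
    set -euxo pipefail
    exec 1>&2
    rc=0
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link testbr - error "$rc"
    fi
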
13830 1727204134.96301: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204134.96310: _low_level_execute_command(): starting 13830 1727204134.96315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204134.6027381-18679-210871905167307/ > /dev/null 2>&1 && sleep 0' 13830 1727204134.96762: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.96770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.96799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204134.96808: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204134.96814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.96828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204134.96839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204134.96842: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204134.96854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204134.96862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204134.96869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204134.96875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204134.96922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204134.96939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204134.96955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204134.97016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204134.98794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204134.98886: stderr chunk 
(state=3): >>><<< 13830 1727204134.98889: stdout chunk (state=3): >>><<< 13830 1727204134.98903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204134.98910: handler run complete 13830 1727204134.98927: Evaluated conditional (False): False 13830 1727204134.98934: attempt loop complete, returning result 13830 1727204134.98944: _execute() done 13830 1727204134.98947: dumping result to json 13830 1727204134.98949: done dumping result, returning 13830 1727204134.98968: done running TaskExecutor() for managed-node3/TASK: Remove test interfaces [0affcd87-79f5-1659-6b02-000000000e55] 13830 1727204134.98971: sending task result for task 0affcd87-79f5-1659-6b02-000000000e55 13830 1727204134.99074: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e55 13830 1727204134.99077: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.053617", "end": "2024-09-24 14:55:34.939395", "rc": 0, "start": "2024-09-24 14:55:34.885778" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 13830 1727204134.99278: no more pending results, returning what we have 13830 1727204134.99281: results queue empty 13830 1727204134.99282: checking for any_errors_fatal 13830 1727204134.99291: done checking for any_errors_fatal 13830 1727204134.99292: checking for max_fail_percentage 13830 1727204134.99293: done checking for max_fail_percentage 13830 1727204134.99294: checking to see if all hosts have failed and the running result is not ok 13830 1727204134.99294: done checking to see if all hosts have failed 13830 1727204134.99295: getting the remaining hosts for this loop 13830 1727204134.99297: done getting the remaining hosts for this loop 13830 1727204134.99300: getting the next task for host managed-node3 13830 1727204134.99307: done getting next task for host managed-node3 13830 1727204134.99309: ^ task is: TASK: Stop dnsmasq/radvd 
services 13830 1727204134.99312: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204134.99316: getting variables 13830 1727204134.99317: in VariableManager get_vars() 13830 1727204134.99375: Calling all_inventory to load vars for managed-node3 13830 1727204134.99378: Calling groups_inventory to load vars for managed-node3 13830 1727204134.99381: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204134.99391: Calling all_plugins_play to load vars for managed-node3 13830 1727204134.99393: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204134.99396: Calling groups_plugins_play to load vars for managed-node3 13830 1727204135.00808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204135.01793: done with get_vars() 13830 1727204135.01811: done getting variables 13830 1727204135.01856: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.482) 0:01:08.096 ***** 13830 1727204135.01884: entering _queue_task() for managed-node3/shell 13830 1727204135.02120: worker is 1 (out of 1 available) 13830 1727204135.02135: exiting _queue_task() for managed-node3/shell 13830 1727204135.02151: done queuing things up, now waiting for results queue to drain 13830 1727204135.02152: waiting for pending results... 
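The next task, 'Stop dnsmasq/radvd services' (remove_test_interfaces_with_dhcp.yml:23), is queued below; its script body does not appear in this portion of the log. Purely as a hypothetical illustration of stopping the test DHCP/RA daemons (commands assumed, not taken from the task):

    # Hypothetical illustration only; the actual task body is not shown in this excerpt.
    # Stop leftover test daemons, tolerating the case where none are running.
    pkill dnsmasq || true
    pkill radvd   || true
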
13830 1727204135.02422: running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services 13830 1727204135.02532: in run() - task 0affcd87-79f5-1659-6b02-000000000e56 13830 1727204135.02546: variable 'ansible_search_path' from source: unknown 13830 1727204135.02550: variable 'ansible_search_path' from source: unknown 13830 1727204135.02587: calling self._execute() 13830 1727204135.02848: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204135.02853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204135.02855: variable 'omit' from source: magic vars 13830 1727204135.03111: variable 'ansible_distribution_major_version' from source: facts 13830 1727204135.03125: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204135.03128: variable 'omit' from source: magic vars 13830 1727204135.03181: variable 'omit' from source: magic vars 13830 1727204135.03215: variable 'omit' from source: magic vars 13830 1727204135.03256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204135.03293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204135.03312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204135.03328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204135.03340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204135.03370: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204135.03374: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204135.03376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204135.03471: Set connection var ansible_connection to ssh 13830 1727204135.03482: Set connection var ansible_timeout to 10 13830 1727204135.03487: Set connection var ansible_shell_executable to /bin/sh 13830 1727204135.03490: Set connection var ansible_shell_type to sh 13830 1727204135.03496: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204135.03506: Set connection var ansible_pipelining to False 13830 1727204135.03531: variable 'ansible_shell_executable' from source: unknown 13830 1727204135.03536: variable 'ansible_connection' from source: unknown 13830 1727204135.03539: variable 'ansible_module_compression' from source: unknown 13830 1727204135.03542: variable 'ansible_shell_type' from source: unknown 13830 1727204135.03544: variable 'ansible_shell_executable' from source: unknown 13830 1727204135.03547: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204135.03549: variable 'ansible_pipelining' from source: unknown 13830 1727204135.03551: variable 'ansible_timeout' from source: unknown 13830 1727204135.03553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204135.04260: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204135.04289: variable 'omit' from source: magic vars 13830 
1727204135.04300: starting attempt loop 13830 1727204135.04307: running the handler 13830 1727204135.04323: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204135.04354: _low_level_execute_command(): starting 13830 1727204135.04370: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204135.04972: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.04983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.05013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.05026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.05075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.05088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.05137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.06979: stdout chunk (state=3): >>>/root <<< 13830 1727204135.06987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.07000: stdout chunk (state=3): >>><<< 13830 1727204135.07008: stderr chunk (state=3): >>><<< 13830 1727204135.07031: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 
1727204135.07053: _low_level_execute_command(): starting 13830 1727204135.07063: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540 `" && echo ansible-tmp-1727204135.0704067-18700-117307289255540="` echo /root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540 `" ) && sleep 0' 13830 1727204135.07709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204135.07726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.07744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.07776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.07817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.07828: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204135.07847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.07867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204135.07881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204135.07892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204135.07904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.07917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.07931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.07946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.07957: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204135.07973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.08053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.08073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.08087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.08162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.09971: stdout chunk (state=3): >>>ansible-tmp-1727204135.0704067-18700-117307289255540=/root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540 <<< 13830 1727204135.10081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.10154: stderr chunk (state=3): >>><<< 13830 1727204135.10158: stdout chunk (state=3): >>><<< 13830 1727204135.10181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204135.0704067-18700-117307289255540=/root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204135.10216: variable 'ansible_module_compression' from source: unknown 13830 1727204135.10274: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204135.10309: variable 'ansible_facts' from source: unknown 13830 1727204135.10392: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540/AnsiballZ_command.py 13830 1727204135.10540: Sending initial data 13830 1727204135.10543: Sent initial data (156 bytes) 13830 1727204135.11503: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204135.11512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.11522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.11538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.11577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.11584: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204135.11595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.11608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204135.11616: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204135.11623: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204135.11630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.11643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.11651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.11659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.11667: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204135.11676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.11759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.11765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.11768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.11850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 
1727204135.13531: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204135.13571: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204135.13608: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpy19_4oj1 /root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540/AnsiballZ_command.py <<< 13830 1727204135.13642: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204135.14591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.14678: stderr chunk (state=3): >>><<< 13830 1727204135.14682: stdout chunk (state=3): >>><<< 13830 1727204135.14703: done transferring module to remote 13830 1727204135.14715: _low_level_execute_command(): starting 13830 1727204135.14718: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540/ /root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540/AnsiballZ_command.py && sleep 0' 13830 1727204135.15456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.15462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.15501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.15508: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204135.15517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.15526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204135.15536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.15544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.15550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204135.15555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.15611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.15633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.15640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.15684: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13830 1727204135.17353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.17559: stderr chunk (state=3): >>><<< 13830 1727204135.17562: stdout chunk (state=3): >>><<< 13830 1727204135.17568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204135.17571: _low_level_execute_command(): starting 13830 1727204135.17574: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540/AnsiballZ_command.py && sleep 0' 13830 1727204135.18205: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204135.18222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.18237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.18257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.18304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.18327: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204135.18344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.18366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204135.18380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204135.18392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204135.18403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.18423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.18444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.18458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.18472: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204135.18486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.18574: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.18595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.18609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.18691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.33838: stdout chunk (state=3): >>> <<< 13830 1727204135.33857: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:35.317166", "end": "2024-09-24 14:55:35.337352", "delta": "0:00:00.020186", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204135.35003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 13830 1727204135.35057: stderr chunk (state=3): >>><<< 13830 1727204135.35060: stdout chunk (state=3): >>><<< 13830 1727204135.35083: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:35.317166", "end": "2024-09-24 14:55:35.337352", "delta": "0:00:00.020186", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
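For readability, the "cmd" reported in the module result above (and repeated in the _execute_module call and task result below) is the following shell snippet, reproduced from the JSON with the \n escapes expanded and indentation added:

    set -uxo pipefail
    exec 1>&2
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
        # Stop radvd server
        service radvd stop
        iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
        for service in dhcp dhcpv6 dhcpv6-client; do
            if firewall-cmd --query-service="$service"; then
                firewall-cmd --remove-service "$service"
            fi
        done
    fi

The xtrace output captured in stderr above shows that on this run only the pkill and rm steps did any work: the host is not a "release 6" system and systemctl reports firewalld as inactive, so both if-blocks were skipped.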
13830 1727204135.35118: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204135.35124: _low_level_execute_command(): starting 13830 1727204135.35129: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204135.0704067-18700-117307289255540/ > /dev/null 2>&1 && sleep 0' 13830 1727204135.35585: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.35588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.35629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.35633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.35637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.35687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.35702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.35780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.37620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.37625: stdout chunk (state=3): >>><<< 13830 1727204135.37628: stderr chunk (state=3): >>><<< 13830 1727204135.37671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204135.37675: handler run complete 13830 1727204135.38071: Evaluated conditional (False): False 13830 1727204135.38075: attempt loop complete, returning result 13830 1727204135.38078: _execute() done 13830 1727204135.38080: dumping result to json 13830 1727204135.38082: done dumping result, returning 13830 1727204135.38083: done running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services [0affcd87-79f5-1659-6b02-000000000e56] 13830 1727204135.38085: sending task result for task 0affcd87-79f5-1659-6b02-000000000e56 13830 1727204135.38162: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e56 ok: [managed-node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.020186", "end": "2024-09-24 14:55:35.337352", "rc": 0, "start": "2024-09-24 14:55:35.317166" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 13830 1727204135.38224: no more pending results, returning what we have 13830 1727204135.38227: results queue empty 13830 1727204135.38229: checking for any_errors_fatal 13830 1727204135.38237: done checking for any_errors_fatal 13830 1727204135.38238: checking for max_fail_percentage 13830 1727204135.38240: done checking for max_fail_percentage 13830 1727204135.38241: checking to see if all hosts have failed and the running result is not ok 13830 1727204135.38241: done checking to see if all hosts have failed 13830 1727204135.38242: getting the remaining hosts for this loop 13830 1727204135.38244: done getting the remaining hosts for this loop 13830 1727204135.38247: getting the next task for host managed-node3 13830 1727204135.38256: done getting next task for host managed-node3 13830 1727204135.38259: ^ task is: TASK: Check routes and DNS 13830 1727204135.38263: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13830 1727204135.38273: getting variables 13830 1727204135.38275: in VariableManager get_vars() 13830 1727204135.38320: Calling all_inventory to load vars for managed-node3 13830 1727204135.38322: Calling groups_inventory to load vars for managed-node3 13830 1727204135.38324: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204135.38332: WORKER PROCESS EXITING 13830 1727204135.38349: Calling all_plugins_play to load vars for managed-node3 13830 1727204135.38352: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204135.38355: Calling groups_plugins_play to load vars for managed-node3 13830 1727204135.41056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204135.43013: done with get_vars() 13830 1727204135.43052: done getting variables 13830 1727204135.43134: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.412) 0:01:08.509 ***** 13830 1727204135.43172: entering _queue_task() for managed-node3/shell 13830 1727204135.43543: worker is 1 (out of 1 available) 13830 1727204135.43557: exiting _queue_task() for managed-node3/shell 13830 1727204135.43573: done queuing things up, now waiting for results queue to drain 13830 1727204135.43575: waiting for pending results... 
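The "Check routes and DNS" task queued above submits another inline shell script. As reported in the module invocation further below, with the \n escapes expanded it reads:

    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi

Its stdout, captured in the result below, records the interface addresses, IPv4 and IPv6 routes, and the NetworkManager-generated /etc/resolv.conf of managed-node3.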
13830 1727204135.43882: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 13830 1727204135.44370: in run() - task 0affcd87-79f5-1659-6b02-000000000e5a 13830 1727204135.44374: variable 'ansible_search_path' from source: unknown 13830 1727204135.44377: variable 'ansible_search_path' from source: unknown 13830 1727204135.44380: calling self._execute() 13830 1727204135.44383: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204135.44385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204135.44387: variable 'omit' from source: magic vars 13830 1727204135.44593: variable 'ansible_distribution_major_version' from source: facts 13830 1727204135.44605: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204135.44611: variable 'omit' from source: magic vars 13830 1727204135.44661: variable 'omit' from source: magic vars 13830 1727204135.44696: variable 'omit' from source: magic vars 13830 1727204135.44743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204135.44775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204135.44800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204135.44969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204135.44973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204135.44976: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204135.44979: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204135.44982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204135.45271: Set connection var ansible_connection to ssh 13830 1727204135.45274: Set connection var ansible_timeout to 10 13830 1727204135.45277: Set connection var ansible_shell_executable to /bin/sh 13830 1727204135.45279: Set connection var ansible_shell_type to sh 13830 1727204135.45281: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204135.45283: Set connection var ansible_pipelining to False 13830 1727204135.45285: variable 'ansible_shell_executable' from source: unknown 13830 1727204135.45287: variable 'ansible_connection' from source: unknown 13830 1727204135.45289: variable 'ansible_module_compression' from source: unknown 13830 1727204135.45891: variable 'ansible_shell_type' from source: unknown 13830 1727204135.45894: variable 'ansible_shell_executable' from source: unknown 13830 1727204135.45898: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204135.45902: variable 'ansible_pipelining' from source: unknown 13830 1727204135.45905: variable 'ansible_timeout' from source: unknown 13830 1727204135.45909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204135.46161: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204135.46175: variable 'omit' from source: magic vars 13830 
1727204135.46180: starting attempt loop 13830 1727204135.46183: running the handler 13830 1727204135.46193: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204135.46323: _low_level_execute_command(): starting 13830 1727204135.46332: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204135.48754: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.48762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.48807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.48815: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.48832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204135.48838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.48913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.48928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.48939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.49008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.50588: stdout chunk (state=3): >>>/root <<< 13830 1727204135.50762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.50771: stderr chunk (state=3): >>><<< 13830 1727204135.50774: stdout chunk (state=3): >>><<< 13830 1727204135.50805: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204135.50823: _low_level_execute_command(): starting 13830 1727204135.50829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745 `" && echo ansible-tmp-1727204135.508054-18727-198163536319745="` echo /root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745 `" ) && sleep 0' 13830 1727204135.52582: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.52596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.52668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.52675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.52740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.52768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 13830 1727204135.52774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.53641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.53939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.53942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.54015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.55860: stdout chunk (state=3): >>>ansible-tmp-1727204135.508054-18727-198163536319745=/root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745 <<< 13830 1727204135.56067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.56071: stdout chunk (state=3): >>><<< 13830 1727204135.56074: stderr chunk (state=3): >>><<< 13830 1727204135.56386: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204135.508054-18727-198163536319745=/root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204135.56390: variable 'ansible_module_compression' from source: unknown 13830 1727204135.56392: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204135.56394: variable 'ansible_facts' from source: unknown 13830 1727204135.56396: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745/AnsiballZ_command.py 13830 1727204135.57127: Sending initial data 13830 1727204135.57130: Sent initial data (155 bytes) 13830 1727204135.58651: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204135.58671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.58686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.58704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.58750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.58763: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204135.58783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.58801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204135.58815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204135.58827: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204135.58843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.58857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.58877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.58891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.58901: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204135.58917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.58999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.59026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.59045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.59121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.60861: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 
1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204135.60915: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204135.60918: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13830nap5ijvl/tmphcquqftc /root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745/AnsiballZ_command.py <<< 13830 1727204135.60952: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204135.62431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.62562: stderr chunk (state=3): >>><<< 13830 1727204135.62567: stdout chunk (state=3): >>><<< 13830 1727204135.62569: done transferring module to remote 13830 1727204135.62572: _low_level_execute_command(): starting 13830 1727204135.62574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745/ /root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745/AnsiballZ_command.py && sleep 0' 13830 1727204135.64018: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.64051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 13830 1727204135.64059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.64062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.64245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.64303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.64393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.66033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.66109: stderr chunk (state=3): >>><<< 13830 1727204135.66113: stdout chunk (state=3): >>><<< 13830 1727204135.66170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204135.66174: _low_level_execute_command(): starting 13830 1727204135.66177: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745/AnsiballZ_command.py && sleep 0' 13830 1727204135.67867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204135.67883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.67897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.67915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.67962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.67977: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204135.67991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.68006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204135.68018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204135.68028: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204135.68042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.68055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.68072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.68155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.68168: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204135.68182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.68268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.68296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.68311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.68404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.82332: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 
qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3132sec preferred_lft 3132sec\n inet6 fe80::8ff:f5ff:fed7:be93/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:55:35.813743", "end": "2024-09-24 14:55:35.822005", "delta": "0:00:00.008262", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204135.83515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 13830 1727204135.83519: stdout chunk (state=3): >>><<< 13830 1727204135.83522: stderr chunk (state=3): >>><<< 13830 1727204135.83571: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3132sec preferred_lft 3132sec\n inet6 fe80::8ff:f5ff:fed7:be93/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:55:35.813743", "end": "2024-09-24 14:55:35.822005", "delta": "0:00:00.008262", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat 
/etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204135.83701: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204135.83704: _low_level_execute_command(): starting 13830 1727204135.83707: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204135.508054-18727-198163536319745/ > /dev/null 2>&1 && sleep 0' 13830 1727204135.85270: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204135.85285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.85297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.85312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.85359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.85374: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204135.85387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 
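The "Check routes and DNS" task whose result is dumped above runs one short shell script on the managed node through ansible.legacy.command. This is the script from the _raw_params field, reformatted for readability (the commands are verbatim from the log; the indentation and # comments are added here):

    set -euo pipefail            # -e exit on error, -u error on unset variables, -o pipefail fail broken pipelines
    echo IP
    ip a                         # interfaces and addresses (lo and eth0 above)
    echo IP ROUTE
    ip route                     # IPv4 routing table
    echo IP -6 ROUTE
    ip -6 route                  # IPv6 routing table
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf     # resolver configuration, generated by NetworkManager on this host
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :   # list whatever resolv.* exists, but never fail this branch
    fi

The echoed markers (IP, IP ROUTE, IP -6 ROUTE, RESOLV) are what delimit the sections of the stdout value captured in the JSON result.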
1727204135.85402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204135.85412: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204135.85420: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204135.85430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204135.85448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204135.85466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204135.85480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204135.85492: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204135.85506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204135.85699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204135.85724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204135.85747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204135.85829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204135.87588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204135.87660: stderr chunk (state=3): >>><<< 13830 1727204135.87663: stdout chunk (state=3): >>><<< 13830 1727204135.87872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204135.87876: handler run complete 13830 1727204135.87878: Evaluated conditional (False): False 13830 1727204135.87880: attempt loop complete, returning result 13830 1727204135.87882: _execute() done 13830 1727204135.87884: dumping result to json 13830 1727204135.87886: done dumping result, returning 13830 1727204135.87888: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [0affcd87-79f5-1659-6b02-000000000e5a] 13830 1727204135.87890: sending task result for task 0affcd87-79f5-1659-6b02-000000000e5a 13830 1727204135.87972: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e5a 13830 1727204135.87977: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo 
pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008262", "end": "2024-09-24 14:55:35.822005", "rc": 0, "start": "2024-09-24 14:55:35.813743" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3132sec preferred_lft 3132sec inet6 fe80::8ff:f5ff:fed7:be93/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 13830 1727204135.88057: no more pending results, returning what we have 13830 1727204135.88062: results queue empty 13830 1727204135.88063: checking for any_errors_fatal 13830 1727204135.88077: done checking for any_errors_fatal 13830 1727204135.88078: checking for max_fail_percentage 13830 1727204135.88080: done checking for max_fail_percentage 13830 1727204135.88081: checking to see if all hosts have failed and the running result is not ok 13830 1727204135.88081: done checking to see if all hosts have failed 13830 1727204135.88082: getting the remaining hosts for this loop 13830 1727204135.88085: done getting the remaining hosts for this loop 13830 1727204135.88090: getting the next task for host managed-node3 13830 1727204135.88100: done getting next task for host managed-node3 13830 1727204135.88103: ^ task is: TASK: Verify DNS and network connectivity 13830 1727204135.88108: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204135.88118: getting variables 13830 1727204135.88120: in VariableManager get_vars() 13830 1727204135.88179: Calling all_inventory to load vars for managed-node3 13830 1727204135.88183: Calling groups_inventory to load vars for managed-node3 13830 1727204135.88185: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204135.88198: Calling all_plugins_play to load vars for managed-node3 13830 1727204135.88201: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204135.88204: Calling groups_plugins_play to load vars for managed-node3 13830 1727204135.90940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204135.94740: done with get_vars() 13830 1727204135.94767: done getting variables 13830 1727204135.94830: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.516) 0:01:09.026 ***** 13830 1727204135.94867: entering _queue_task() for managed-node3/shell 13830 1727204135.95824: worker is 1 (out of 1 available) 13830 1727204135.95838: exiting _queue_task() for managed-node3/shell 13830 1727204135.95850: done queuing things up, now waiting for results queue to drain 13830 1727204135.95852: waiting for pending results... 13830 1727204135.96786: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 13830 1727204135.97018: in run() - task 0affcd87-79f5-1659-6b02-000000000e5b 13830 1727204135.97146: variable 'ansible_search_path' from source: unknown 13830 1727204135.97152: variable 'ansible_search_path' from source: unknown 13830 1727204135.97187: calling self._execute() 13830 1727204135.97406: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204135.97410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204135.97422: variable 'omit' from source: magic vars 13830 1727204135.98244: variable 'ansible_distribution_major_version' from source: facts 13830 1727204135.98258: Evaluated conditional (ansible_distribution_major_version != '6'): True 13830 1727204135.98514: variable 'ansible_facts' from source: unknown 13830 1727204136.10932: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 13830 1727204136.10951: variable 'omit' from source: magic vars 13830 1727204136.11022: variable 'omit' from source: magic vars 13830 1727204136.11070: variable 'omit' from source: magic vars 13830 1727204136.11106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13830 1727204136.11144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13830 1727204136.11168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13830 1727204136.11190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204136.11202: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13830 1727204136.11226: variable 'inventory_hostname' from source: host vars for 'managed-node3' 13830 1727204136.11233: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204136.11246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204136.11342: Set connection var ansible_connection to ssh 13830 1727204136.11364: Set connection var ansible_timeout to 10 13830 1727204136.11377: Set connection var ansible_shell_executable to /bin/sh 13830 1727204136.11383: Set connection var ansible_shell_type to sh 13830 1727204136.11392: Set connection var ansible_module_compression to ZIP_DEFLATED 13830 1727204136.11406: Set connection var ansible_pipelining to False 13830 1727204136.11433: variable 'ansible_shell_executable' from source: unknown 13830 1727204136.11443: variable 'ansible_connection' from source: unknown 13830 1727204136.11449: variable 'ansible_module_compression' from source: unknown 13830 1727204136.11455: variable 'ansible_shell_type' from source: unknown 13830 1727204136.11463: variable 'ansible_shell_executable' from source: unknown 13830 1727204136.11481: variable 'ansible_host' from source: host vars for 'managed-node3' 13830 1727204136.11495: variable 'ansible_pipelining' from source: unknown 13830 1727204136.11598: variable 'ansible_timeout' from source: unknown 13830 1727204136.11608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 13830 1727204136.11727: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204136.11746: variable 'omit' from source: magic vars 13830 1727204136.11754: starting attempt loop 13830 1727204136.11760: running the handler 13830 1727204136.11776: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 13830 1727204136.11794: _low_level_execute_command(): starting 13830 1727204136.11803: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13830 1727204136.13640: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204136.13661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.13682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.13699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.13747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.13759: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204136.13775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.13794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204136.13805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 
1727204136.13818: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204136.13830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.13847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.13868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.13880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.13894: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204136.13907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.13996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204136.14023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204136.14042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204136.14117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204136.15727: stdout chunk (state=3): >>>/root <<< 13830 1727204136.15931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204136.15937: stdout chunk (state=3): >>><<< 13830 1727204136.15940: stderr chunk (state=3): >>><<< 13830 1727204136.16076: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204136.16087: _low_level_execute_command(): starting 13830 1727204136.16090: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801 `" && echo ansible-tmp-1727204136.1597443-18748-51691329862801="` echo /root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801 `" ) && sleep 0' 13830 1727204136.16747: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204136.16762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.16784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.16805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 
1727204136.16866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.16886: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204136.16901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.16926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204136.16951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204136.16967: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204136.16981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.16996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.17012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.17025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.17046: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204136.17073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.17165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204136.17186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204136.17202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204136.17288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204136.19114: stdout chunk (state=3): >>>ansible-tmp-1727204136.1597443-18748-51691329862801=/root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801 <<< 13830 1727204136.19305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204136.19309: stdout chunk (state=3): >>><<< 13830 1727204136.19316: stderr chunk (state=3): >>><<< 13830 1727204136.19338: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204136.1597443-18748-51691329862801=/root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204136.19363: variable 'ansible_module_compression' from source: unknown 13830 
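Around this point the log shows the staging sequence the ssh connection plugin runs before a module can execute. Condensed into one sketch, with ansible-tmp-EXAMPLE standing in for the real ansible-tmp-<timestamp>-<pid>-<random> directory name used in this run:

    # 1. create a private temp dir (umask 77 => mode 0700) and echo its path back to the controller
    ( umask 77 && mkdir -p "$HOME/.ansible/tmp" \
        && mkdir "$HOME/.ansible/tmp/ansible-tmp-EXAMPLE" \
        && echo ansible-tmp-EXAMPLE="$HOME/.ansible/tmp/ansible-tmp-EXAMPLE" )
    # 2. upload the zipped module payload into that dir (done below via sftp: "put ... AnsiballZ_command.py")
    # 3. mark the dir and payload executable, then run the payload with the remote Python
    chmod u+x "$HOME/.ansible/tmp/ansible-tmp-EXAMPLE" \
              "$HOME/.ansible/tmp/ansible-tmp-EXAMPLE/AnsiballZ_command.py"
    /usr/bin/python3.9 "$HOME/.ansible/tmp/ansible-tmp-EXAMPLE/AnsiballZ_command.py"
    # 4. remove the temp dir once the JSON result has been read back
    rm -f -r "$HOME/.ansible/tmp/ansible-tmp-EXAMPLE" > /dev/null 2>&1

Each numbered step corresponds to one of the _low_level_execute_command() calls that follow in the log.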
1727204136.19415: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13830nap5ijvl/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13830 1727204136.19448: variable 'ansible_facts' from source: unknown 13830 1727204136.19539: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801/AnsiballZ_command.py 13830 1727204136.19677: Sending initial data 13830 1727204136.19680: Sent initial data (155 bytes) 13830 1727204136.20605: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204136.20614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.20624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.20639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.20682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.20688: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204136.20697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.20711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204136.20718: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204136.20725: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204136.20733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.20742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.20753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.20760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.20778: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204136.20787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.20858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204136.20889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204136.20893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204136.20959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204136.22651: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13830 1727204136.22681: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 13830 1727204136.22720: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-13830nap5ijvl/tmpq4owv3a8 /root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801/AnsiballZ_command.py <<< 13830 1727204136.22758: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 13830 1727204136.23868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204136.23952: stderr chunk (state=3): >>><<< 13830 1727204136.23955: stdout chunk (state=3): >>><<< 13830 1727204136.23979: done transferring module to remote 13830 1727204136.23988: _low_level_execute_command(): starting 13830 1727204136.23993: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801/ /root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801/AnsiballZ_command.py && sleep 0' 13830 1727204136.24642: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204136.24651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.24660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.24678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.24716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.24723: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204136.24733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.24750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204136.24757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204136.24764: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204136.24778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.24787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.24798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.24805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.24812: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204136.24822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.24899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204136.24916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204136.24927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204136.24997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204136.26784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204136.26789: stdout chunk (state=3): >>><<< 13830 1727204136.26793: stderr chunk (state=3): >>><<< 13830 1727204136.26820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204136.26824: _low_level_execute_command(): starting 13830 1727204136.26827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801/AnsiballZ_command.py && sleep 0' 13830 1727204136.27750: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204136.27763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.27776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.27791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.27830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.27841: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204136.27884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.27888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204136.27891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204136.27893: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204136.27898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.27907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.27919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.27926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.27932: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204136.27945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.28029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204136.28050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204136.28068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204136.28171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204136.49008: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 
wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 9838 0 --:--:-- --:--:-- --:--:-- 10166\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 12652 0 --:--:-- --:--:-- --:--:-- 12652", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:55:36.411474", "end": "2024-09-24 14:55:36.488574", "delta": "0:00:00.077100", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13830 1727204136.50273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
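The "Verify DNS and network connectivity" payload whose raw result appears above runs this script, again taken from the _raw_params field (commands verbatim; comments added for readability):

    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # name resolution: getent prints the resolved addresses seen in the stdout field
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        # reachability: curl's progress meter is what fills the stderr field
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done

Both mirror names resolve (to the wildcard.fedoraproject.org addresses listed in stdout) and both HTTPS requests complete (305 and 291 bytes fetched, per the curl meters in stderr), so the script exits 0 and the task reports ok.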
<<< 13830 1727204136.50342: stderr chunk (state=3): >>><<< 13830 1727204136.50346: stdout chunk (state=3): >>><<< 13830 1727204136.50373: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 9838 0 --:--:-- --:--:-- --:--:-- 10166\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 12652 0 --:--:-- --:--:-- --:--:-- 12652", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:55:36.411474", "end": "2024-09-24 14:55:36.488574", "delta": "0:00:00.077100", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 13830 1727204136.50417: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13830 1727204136.50425: _low_level_execute_command(): starting 13830 1727204136.50430: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204136.1597443-18748-51691329862801/ > /dev/null 2>&1 && sleep 0' 13830 1727204136.51096: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13830 1727204136.51105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.51115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.51129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.51172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.51179: stderr chunk (state=3): >>>debug2: match not found <<< 13830 1727204136.51189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.51203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13830 1727204136.51210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 13830 1727204136.51218: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13830 1727204136.51226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13830 1727204136.51235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13830 1727204136.51250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13830 1727204136.51257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 13830 1727204136.51270: stderr chunk (state=3): >>>debug2: match found <<< 13830 1727204136.51276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13830 1727204136.51352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 13830 1727204136.51379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13830 1727204136.51383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13830 1727204136.51451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13830 1727204136.53203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13830 1727204136.53305: stderr chunk (state=3): >>><<< 13830 1727204136.53309: stdout chunk (state=3): >>><<< 13830 1727204136.53330: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13830 1727204136.53340: handler run complete 13830 1727204136.53363: Evaluated conditional (False): False 13830 1727204136.53375: attempt loop complete, returning result 13830 1727204136.53378: _execute() done 13830 1727204136.53380: dumping result to json 13830 1727204136.53386: done dumping result, returning 13830 1727204136.53394: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [0affcd87-79f5-1659-6b02-000000000e5b] 13830 1727204136.53398: sending task result for task 0affcd87-79f5-1659-6b02-000000000e5b 13830 1727204136.53513: done sending task result for task 0affcd87-79f5-1659-6b02-000000000e5b 13830 1727204136.53515: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.077100", "end": "2024-09-24 14:55:36.488574", "rc": 0, "start": "2024-09-24 14:55:36.411474" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 9838 0 --:--:-- --:--:-- --:--:-- 10166 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 12652 0 --:--:-- --:--:-- --:--:-- 12652 13830 1727204136.53585: no more pending results, returning what we have 13830 1727204136.53589: results queue empty 13830 1727204136.53590: checking for any_errors_fatal 13830 1727204136.53599: done checking for any_errors_fatal 13830 1727204136.53599: checking for max_fail_percentage 13830 1727204136.53601: done checking for max_fail_percentage 13830 1727204136.53602: checking to see if all hosts have failed and the running result is not ok 13830 1727204136.53603: done checking to see if all hosts have failed 13830 1727204136.53603: getting the remaining hosts for this loop 13830 1727204136.53605: done getting the remaining hosts for this loop 13830 1727204136.53609: getting the next task for host managed-node3 13830 1727204136.53619: done getting next task for host managed-node3 13830 1727204136.53622: ^ task is: TASK: meta (flush_handlers) 13830 1727204136.53625: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204136.53630: getting variables 13830 1727204136.53632: in VariableManager get_vars() 13830 1727204136.53683: Calling all_inventory to load vars for managed-node3 13830 1727204136.53686: Calling groups_inventory to load vars for managed-node3 13830 1727204136.53688: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204136.53698: Calling all_plugins_play to load vars for managed-node3 13830 1727204136.53700: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204136.53703: Calling groups_plugins_play to load vars for managed-node3 13830 1727204136.60952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204136.62738: done with get_vars() 13830 1727204136.62770: done getting variables 13830 1727204136.62842: in VariableManager get_vars() 13830 1727204136.62862: Calling all_inventory to load vars for managed-node3 13830 1727204136.62867: Calling groups_inventory to load vars for managed-node3 13830 1727204136.62870: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204136.62875: Calling all_plugins_play to load vars for managed-node3 13830 1727204136.62878: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204136.62881: Calling groups_plugins_play to load vars for managed-node3 13830 1727204136.64089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204136.64997: done with get_vars() 13830 1727204136.65016: done queuing things up, now waiting for results queue to drain 13830 1727204136.65018: results queue empty 13830 1727204136.65019: checking for any_errors_fatal 13830 1727204136.65022: done checking for any_errors_fatal 13830 1727204136.65022: checking for max_fail_percentage 13830 1727204136.65023: done checking for max_fail_percentage 13830 1727204136.65023: checking to see if all hosts have failed and the running result is not ok 13830 1727204136.65024: done checking to see if all hosts have failed 13830 1727204136.65024: getting the remaining hosts for this loop 13830 1727204136.65025: done getting the remaining hosts for this loop 13830 1727204136.65027: getting the next task for host managed-node3 13830 1727204136.65030: done getting next task for host managed-node3 13830 1727204136.65031: ^ task is: TASK: meta (flush_handlers) 13830 1727204136.65032: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13830 1727204136.65038: getting variables 13830 1727204136.65039: in VariableManager get_vars() 13830 1727204136.65051: Calling all_inventory to load vars for managed-node3 13830 1727204136.65052: Calling groups_inventory to load vars for managed-node3 13830 1727204136.65053: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204136.65057: Calling all_plugins_play to load vars for managed-node3 13830 1727204136.65059: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204136.65060: Calling groups_plugins_play to load vars for managed-node3 13830 1727204136.66053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204136.67894: done with get_vars() 13830 1727204136.67918: done getting variables 13830 1727204136.67958: in VariableManager get_vars() 13830 1727204136.67975: Calling all_inventory to load vars for managed-node3 13830 1727204136.67977: Calling groups_inventory to load vars for managed-node3 13830 1727204136.67978: Calling all_plugins_inventory to load vars for managed-node3 13830 1727204136.67982: Calling all_plugins_play to load vars for managed-node3 13830 1727204136.67984: Calling groups_plugins_inventory to load vars for managed-node3 13830 1727204136.67987: Calling groups_plugins_play to load vars for managed-node3 13830 1727204136.68751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13830 1727204136.69777: done with get_vars() 13830 1727204136.69804: done queuing things up, now waiting for results queue to drain 13830 1727204136.69807: results queue empty 13830 1727204136.69808: checking for any_errors_fatal 13830 1727204136.69809: done checking for any_errors_fatal 13830 1727204136.69810: checking for max_fail_percentage 13830 1727204136.69811: done checking for max_fail_percentage 13830 1727204136.69811: checking to see if all hosts have failed and the running result is not ok 13830 1727204136.69812: done checking to see if all hosts have failed 13830 1727204136.69813: getting the remaining hosts for this loop 13830 1727204136.69814: done getting the remaining hosts for this loop 13830 1727204136.69817: getting the next task for host managed-node3 13830 1727204136.69820: done getting next task for host managed-node3 13830 1727204136.69821: ^ task is: None 13830 1727204136.69823: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
13830 1727204136.69824: done queuing things up, now waiting for results queue to drain
13830 1727204136.69825: results queue empty
13830 1727204136.69825: checking for any_errors_fatal
13830 1727204136.69826: done checking for any_errors_fatal
13830 1727204136.69827: checking for max_fail_percentage
13830 1727204136.69828: done checking for max_fail_percentage
13830 1727204136.69828: checking to see if all hosts have failed and the running result is not ok
13830 1727204136.69829: done checking to see if all hosts have failed
13830 1727204136.69831: getting the next task for host managed-node3
13830 1727204136.69836: done getting next task for host managed-node3
13830 1727204136.69837: ^ task is: None
13830 1727204136.69839: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3 : ok=148 changed=4 unreachable=0 failed=0 skipped=97 rescued=0 ignored=0

Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.750) 0:01:09.777 *****
===============================================================================
** TEST check bond settings --------------------------------------------- 5.89s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
** TEST check bond settings --------------------------------------------- 2.21s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
Gathering Facts --------------------------------------------------------- 1.89s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.88s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.80s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.63s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Create test interfaces -------------------------------------------------- 1.59s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Check which services are running ---- 1.56s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install dnsmasq --------------------------------------------------------- 1.31s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Install dnsmasq --------------------------------------------------------- 1.27s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Install pgrep, sysctl --------------------------------------------------- 1.26s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.26s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install pgrep, sysctl --------------------------------------------------- 1.23s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.20s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.08s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.02s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.98s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.97s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
13830 1727204136.69957: RUNNING CLEANUP
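A note on the OpenSSH stderr that repeats before every /bin/sh invocation above: "auto-mux: Trying existing master" and "mux_client_request_session" mean each command reused one persistent, multiplexed SSH connection to 10.31.15.87 instead of negotiating a new session, which is why these repeated low-level commands return so quickly. The ssh connection plugin enables this by default through its ssh_args (ControlMaster=auto plus a ControlPersist timeout). A minimal manual reproduction against the same root@10.31.15.87 target, with an illustrative ControlPath chosen here rather than taken from this run:

    # the first invocation starts a background master and keeps it alive for 60s after the last client
    ssh -o ControlMaster=auto -o ControlPersist=60s \
        -o ControlPath=/tmp/ssh-mux-%r@%h:%p root@10.31.15.87 true
    # later invocations attach to the existing master; with -v they log the same
    # "auto-mux: Trying existing master" message seen throughout this run
    ssh -v -o ControlMaster=auto -o ControlPersist=60s \
        -o ControlPath=/tmp/ssh-mux-%r@%h:%p root@10.31.15.87 'echo ~ && sleep 0'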